Solving User Time Greater Than Real Time Problem

  • Thread starter FrostScYthe
In summary, the poster measured user, system, and real time for an application using timer source code found online and got puzzling results: in one run the user time exceeds the real time, and in others the sum of user and system time falls far short of the real time. Further testing showed that inserting rows takes far longer in InnoDB or Berkeley DB tables than in MyISAM, which they suspect is due to the transactional nature of those engines. They are asking what could cause this behaviour.
  • #1
FrostScYthe
Hi everyone,

I have been measuring some times for an application I'm writing, using source code I found on the internet that reports the user time and system time consumed by a process. I am getting some very strange results, though, and wanted to ask whether they are plausible or more likely due to a programming error.

The results

User time greater than real time, that's weird

Reading from Dataset
User time was: 19.333
System time was: 1.29179
Real time was: 18

Their sum isn't close to the real time, where is this time being spent?
Inserting to Database with engine MyISAM
User time was: 0.396942
System time was: 0.849879
Real time was: 9

Their sum isn't close to the real time, where is this time being spent?
Inserting to Database with engine InnoDB
User time was: 0.440937
System time was: 1.70573
Real time was: 211
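
The MyISAM and InnoDB runs, where user + system is far below real time, are what CPU accounting looks like when a process spends most of its time blocked. A minimal sketch (helper names are my own, not from the Timer class above) contrasting wall-clock time with getrusage()-style CPU time during a deliberate wait:

```cpp
#include <chrono>
#include <thread>
#include <utility>
#include <sys/resource.h>

// CPU seconds (user + system) consumed so far by this process.
inline double cpu_seconds() {
    struct rusage ru;
    getrusage(RUSAGE_SELF, &ru);
    return ru.ru_utime.tv_sec + ru.ru_utime.tv_usec / 1e6
         + ru.ru_stime.tv_sec + ru.ru_stime.tv_usec / 1e6;
}

// Sleep for `ms` milliseconds and return {cpu_elapsed, wall_elapsed}.
// A sleeping (or I/O-blocked) process accrues wall-clock time but
// almost no CPU time, which is why real time can dwarf user + system.
inline std::pair<double, double> sleep_and_measure(int ms) {
    double cpu0 = cpu_seconds();
    auto t0 = std::chrono::steady_clock::now();
    std::this_thread::sleep_for(std::chrono::milliseconds(ms));
    double cpu = cpu_seconds() - cpu0;
    double wall = std::chrono::duration<double>(
        std::chrono::steady_clock::now() - t0).count();
    return {cpu, wall};
}
```

Waiting on the database server behaves just like the sleep here: the wall clock advances, but the kernel charges the process no CPU time.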

This is the source I'm using to measure time
Code:
/* -------------------------------------------------------------

   Timer.hpp

   Implementation of a Timer class.

   (P)1999-2000  Laurentiu Cristofor

   -------------------------------------------------------------*/

#ifndef _TIMER_HPP_
#define _TIMER_HPP_

extern "C"
{
#include <stdio.h>
#include <string.h>
#include <sys/types.h>
#include <sys/time.h>
#include <sys/resource.h>
}

#include <iostream>

// This class is a container for time results
class TimeElapsed
{
 friend std::ostream& operator<<(std::ostream& os, TimeElapsed& t);

private:
  long userTime, systemTime, realTime;

public:
  TimeElapsed(long user, long system, long real)
    {
      userTime = user;
      systemTime = system;
      realTime = real;
    }

  long getUserTime()
    {
      return userTime;
    }

  long getSystemTime()
    {
      return systemTime;
    }

  long getRealTime()
    {
      return realTime;
    }
};
class Timer
{
 private:
  time_t begin, end;
  struct rusage beginusage, endusage;
  // the time counted so far
  long utime, stime, rtime;

  // indicate whether the timer has been started
  bool started;

  // if true then return number of microseconds
  // for user and system time;
  // else return number of seconds.
  bool precision; 

 public:
  // if precision is set to true, then the user and system time
  // will be measured in microseconds
  // (real time is always measured in whole seconds)
  Timer(bool precision = false)
    {
      utime = stime = rtime = 0;
      started = false;
      this->precision = precision;
    }

  // start timer, if already started then do nothing
  void start()
    {
      if (started)
	return;

      started = true;
      begin = time(NULL);
      if (getrusage(RUSAGE_SELF, &beginusage) == -1)
	puts("getrusage error!");
    }

  // stop timer and return time measured so far.
  // if timer was stopped the time returned will be 0.
  TimeElapsed stop()
    {
      if (!started)
	return TimeElapsed(0, 0, 0);

      if (getrusage(RUSAGE_SELF, &endusage) == -1)
	puts("getrusage error!");
      end = time(NULL);
      started = false;

      if (precision)
	{
	  long uusecs = (endusage.ru_utime.tv_sec 
			 - beginusage.ru_utime.tv_sec) * 1000000 
	    + endusage.ru_utime.tv_usec - beginusage.ru_utime.tv_usec;
	  utime += uusecs;

	  long susecs = (endusage.ru_stime.tv_sec 
			 - beginusage.ru_stime.tv_sec) * 1000000 
	    + endusage.ru_stime.tv_usec - beginusage.ru_stime.tv_usec;
	  stime += susecs;
	}
      else
	{
	  long usecs = (endusage.ru_utime.tv_sec 
			- beginusage.ru_utime.tv_sec);
	  utime += usecs;

	  long ssecs = (endusage.ru_stime.tv_sec 
			- beginusage.ru_stime.tv_sec);
	  stime += ssecs;
	}

      rtime += (end - begin);

      return TimeElapsed(utime, stime, rtime);
    }

  // reset the timer, this will reset the time measured to 0 and
  // will leave the timer in the same status (started or stopped).
  void reset()
    {
      utime = stime = rtime = 0;

      if (started)
	{
	  begin = time(NULL);
	  if (getrusage(RUSAGE_SELF, &beginusage) == -1)
	    puts("getrusage error!");
	}
    }

  // return time measured up to this point.
  TimeElapsed getTime()
  {
    if (!started)
      return TimeElapsed(utime, stime, rtime);

    if (getrusage(RUSAGE_SELF, &endusage) == -1)
      puts("getrusage error!");
    end = time(NULL);

    if (precision)
      {
	long uusecs = (endusage.ru_utime.tv_sec 
		       - beginusage.ru_utime.tv_sec) * 1000000 
	  + endusage.ru_utime.tv_usec - beginusage.ru_utime.tv_usec;

	long susecs = (endusage.ru_stime.tv_sec 
		       - beginusage.ru_stime.tv_sec) * 1000000 
	  + endusage.ru_stime.tv_usec - beginusage.ru_stime.tv_usec;

	return TimeElapsed(utime + uusecs, 
			   stime + susecs, 
			   rtime + end - begin);
      }
    else
      {
	long usecs = (endusage.ru_utime.tv_sec 
		      - beginusage.ru_utime.tv_sec);
	
	long ssecs = (endusage.ru_stime.tv_sec 
		      - beginusage.ru_stime.tv_sec);
	
	return TimeElapsed(utime + usecs, 
			   stime + ssecs, 
			   rtime + end - begin);
      }
  }

  bool isStarted()
  {
    return started;
  }
};

// NOTE: this assumes the Timer was built with precision = true, so the
// stored user/system times are in microseconds; with precision = false
// they are seconds and the division below makes the output meaningless.
inline std::ostream& operator<<(std::ostream& os, TimeElapsed& t)
{
    float userTime = t.getUserTime()/1000000.0;
    float sysTime  = t.getSystemTime()/1000000.0; 
    
    os << "User time was: " << userTime << std::endl
       << "System time was: " << sysTime << std::endl
       << "Real time was: " << t.getRealTime() << std::endl;

    return os;
}

#endif // _TIMER_HPP_
 
  • #2
I analyzed this further and wonder whether someone knows what could be happening. The original program inserted rows into a MySQL table one by one (each row got its own INSERT query); in the latest version I build a single INSERT query with 500 rows. For a MyISAM table there is no major difference between the two approaches, but with InnoDB or Berkeley DB the row-by-row times shoot up awfully high. Could this be attributed to the fact that InnoDB and BDB are transactional?
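
If the transactional engines are the culprit, the usual fix is to stop paying one log flush per row. A hypothetical sketch of the same 500 inserts done three ways (the table `t` and its columns are made up for illustration):

```sql
-- 1) Row by row with autocommit on: InnoDB/BDB treat every statement
--    as its own transaction and flush the log to disk each time,
--    so 500 inserts cost roughly 500 disk syncs.
INSERT INTO t (a, b) VALUES (1, 'x');
-- ... repeated 500 times ...

-- 2) Row by row inside one explicit transaction: one sync at COMMIT.
BEGIN;
INSERT INTO t (a, b) VALUES (1, 'x');
-- ... repeated 500 times ...
COMMIT;

-- 3) One multi-row statement: one parse, one sync.
INSERT INTO t (a, b) VALUES (1, 'x'), (2, 'y'), (3, 'z') /* ... */;
```

MyISAM performs no per-statement sync, which would explain why it shows little difference between the row-by-row and batched versions.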

o_O?
 
  • #3

Based on the source code provided, the user and system times are obtained from getrusage(), which reports the CPU time actually consumed by the process: user time is time spent executing the program's own code, and system time is time spent in the kernel on its behalf. Crucially, time the process spends blocked (waiting for disk I/O, the network, or another process such as the database server) is charged to neither account. That is why the sum of user and system time can fall far short of the real, wall-clock time for I/O-heavy work like database inserts.

The precision flag matters too. With precision set to true, the user and system times are returned in microseconds; with it false they are whole seconds, yet operator<< divides by 1000000.0 either way, so the printed values are only meaningful in precision mode. Real time, meanwhile, comes from time(NULL), which has one-second resolution: a reported real time of 18 can mean anything up to almost 19 seconds. And because getrusage() sums CPU time across all threads of the process, a multi-threaded program on a multi-core machine can quite legitimately show more user time than real time.
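
Note that getrusage(RUSAGE_SELF, ...) charges the process for CPU consumed by all of its threads, which is one way user time can legitimately exceed real time on a multi-core machine. A small sketch (helper names are my own) that busy-loops in two threads and compares the totals:

```cpp
#include <atomic>
#include <chrono>
#include <thread>
#include <utility>
#include <vector>
#include <sys/resource.h>

// Total CPU seconds (user + system) for the whole process, all threads.
inline double process_cpu_seconds() {
    struct rusage ru;
    getrusage(RUSAGE_SELF, &ru);
    return ru.ru_utime.tv_sec + ru.ru_utime.tv_usec / 1e6
         + ru.ru_stime.tv_sec + ru.ru_stime.tv_usec / 1e6;
}

// Run `nthreads` busy loops for `ms` milliseconds of wall time and
// return {cpu_elapsed, wall_elapsed}. With 2+ threads on a multi-core
// machine, the CPU total can exceed the wall-clock time.
inline std::pair<double, double> burn(int nthreads, int ms) {
    double cpu0 = process_cpu_seconds();
    auto t0 = std::chrono::steady_clock::now();
    auto deadline = t0 + std::chrono::milliseconds(ms);

    std::atomic<unsigned long> sink{0};  // keeps the loop from being optimized away
    std::vector<std::thread> pool;
    for (int i = 0; i < nthreads; ++i)
        pool.emplace_back([&] {
            while (std::chrono::steady_clock::now() < deadline)
                sink.fetch_add(1, std::memory_order_relaxed);
        });
    for (auto& t : pool) t.join();

    double cpu = process_cpu_seconds() - cpu0;
    double wall = std::chrono::duration<double>(
        std::chrono::steady_clock::now() - t0).count();
    return {cpu, wall};
}
```

On a machine with two or more free cores, burn(2, 300) typically reports roughly twice as much CPU time as wall time, the same shape as the "19.333 user vs 18 real" result above.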

It is also possible that there is a programming error in the source code being used, which could be causing incorrect time measurements. It may be helpful to review the code and ensure that it is correctly calculating and reporting the times.

In order to accurately measure the time used by an application, it may be beneficial to use a more precise and reliable method, such as a profiler tool specifically designed for this purpose. This can provide more detailed and accurate information about the time spent in different parts of the application.
 

Related to Solving User Time Greater Than Real Time Problem

1. What does "user time greater than real time" mean?

User time is the CPU time a process spends executing its own code, system time is CPU time spent in the kernel on its behalf, and real time is elapsed wall-clock time. For a single-threaded process, user + system should never exceed real time, so a larger user time looks impossible at first glance.

2. How can user time legitimately exceed real time?

getrusage() (like the shell's time command) sums CPU time over all threads of the process, so a program keeping several cores busy can consume more than one CPU-second per wall-clock second. Coarse timers add to the effect: measuring real time with time(NULL) rounds to whole seconds, so a reported 18 may really mean anything just under 19.

3. Why can user + system time be far below real time?

Time the process spends blocked, waiting on disk I/O, network replies, locks, or another process such as a database server, advances real time but is charged to neither user nor system time. That is exactly the profile of the insert benchmarks above.

4. Why are the InnoDB and Berkeley DB inserts so much slower than MyISAM?

Both engines are transactional. With autocommit enabled, every single-row INSERT is its own transaction and typically forces the transaction log to disk; MyISAM performs no such per-statement sync. Wrapping the inserts in one explicit transaction, or using a single multi-row INSERT, removes most of the overhead.

5. How can these times be measured more reliably?

Use getrusage() for CPU time, but take wall-clock time from a sub-second monotonic source such as clock_gettime(CLOCK_MONOTONIC) or std::chrono::steady_clock, keep the units consistent in the reporting code, and reach for a profiler when you need to know where inside the program the time goes.
