Matt

For Loops With Time


I was wondering if there is a syntax of the for statement in C++ that makes it loop for a set amount of time, like for(2 hours) do this. I'm sure there's a way to make things happen for a set amount of time... If it's too complicated then I may not use it, but I'd still like to know.

Matt


No syntax, but you can bang something together using the <ctime> functions. Here's a (typically stupid) busy-wait function

#include <ctime>

using namespace std;

void busy_wait(unsigned sec)
{
   for (clock_t t = clock(); (clock() - t)/CLOCKS_PER_SEC < sec; )
     ;
}

(IMO this is more natural with a while loop.)
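Here is roughly what that while-loop version might look like. This is just a sketch; it polls time()/difftime() rather than clock(), since clock() measures processor time on some platforms, which isn't what a wall-clock wait wants.

```cpp
#include <ctime>

// Sketch: busy-wait for roughly `sec` seconds of wall-clock time.
// difftime() returns the difference between two time_t values in
// seconds as a double, so the comparison works in seconds directly.
void busy_wait(unsigned sec)
{
    std::time_t start = std::time(0);
    while (std::difftime(std::time(0), start) < sec)
        ;  // spin until enough wall-clock time has passed
}
```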

Edited by jcl


Hi jcl,

So say I wanted it to print the value of int a on the screen for 10 seconds. Would this be my code?

#include <ctime>

using namespace std;

int a;
a = 5;

void busy_wait(unsigned sec)
{
  for (clock_t t = clock(); (clock() - t)/CLOCKS_PER_SEC < 10; )
   ;
cout << a;
}
else;
cout << "I printed the value of A for 10 seconds";
cout << endl;

Matt


D'oh. Forget what I posted, it doesn't work. I didn't read the description of clock() closely enough and didn't test the code well. The only portable way to do what you want is to use difftime() or to build something on gmtime() or localtime(). For example

#include <ctime>
#include <iostream>

using namespace std;

int main(void)
{
   int a = 5;
   for (time_t t = time(0); difftime(time(0), t) < 10; ) {
       cout << a;
   }
   
   return 0;
}

If you don't care about portability, most operating systems provide more useful time-related functions. For example, on POSIX systems the function time() returns the current time in seconds since the Epoch. So you could write

#include <ctime>
#include <iostream>

using namespace std;

int main(void)
{
   int a = 5;
   for (time_t t = time(0); t + 10 > time(0); ) {
       cout << a;
   }
   
   return 0;
}

Note that neither of these loops will run for exactly 10 seconds; there's a margin of error of about 1s either way depending on how the tests are written. If you need really accurate timing you have to use something more complex. And it still won't be absolutely accurate.

Edited by jcl

Wow... this brings back memories of when I was learning Python. For time, Python had perfect syntax... sorry, but I've forgotten much of my Python by now...

Well, almost anything is better than C and C++.

Edited by jcl


Ok, thanks. So how would I go about changing it to minutes or even hours? If, say, I did it in hours, would the margin of error be an hour, or still a second?

Also,

Well, almost anything is better than C and C++.

Is C++ a bad language to spend time learning?

Matt

Ok, thanks. So how would I go about changing it to minutes or even hours?

Multiply by 60 or 3600 :)

    // Loop for an hour and a quarter
    for (time_t t = time(0); difftime(time(0), t) < (3600 + 15 * 60); ) {
        cout << a;
    }

If, say, I did it in hours, would the margin of error be an hour, or still a second?

The error is determined by the granularity of the clock. If you build the timing code on top of a second-based clock, it should be accurate to within a second or so. 'Course you'd need to test it to be sure (rule to live by: test everything), which could be a problem if the timeout is quite long.

You also have to factor in the time it takes to execute the code within the loop.

There are better ways to do this, but they're non-portable and complicated. Polling the clock every iteration is a stupid brute-force solution that should only be used if absolutely necessary. Like if you're writing pure ISO C++.

Is C++ a bad language to spend time learning?

Not at all. It's good to learn it. As they say, know thine enemy.

This is something you'll pick up eventually, by the way. Insulting programming languages, I mean. It comes with the territory. If you're bored, poke around here for a few minutes.

Edited by jcl

Well, almost anything is better than C and C++.

Is C++ a bad language to spend time learning?

Matt

I just got something to add here as well:

"C is very efficient, and very sparing of your machine's resources. Unfortunately, C gets that efficiency by requiring you to do a lot of low-level management of resources (like memory) by hand. All that low-level code is complex and bug-prone, and will soak up huge amounts of your time on debugging. With today's machines as powerful as they are, this is usually a bad tradeoff — it's smarter to use a language that uses the machine's time less efficiently, but your time much more efficiently. Thus, Python."

From one of my sources that explains how to be a hacker--not a cracker.

http://www.catb.org/~esr/faqs/hacker-howto.html

Most of the stuff there is just ethics, but if you go to the place where he starts talking about knowledge, it's quite interesting.


Thanks for the info. I was just thinking... if I changed

    for (time_t t = time(0); difftime(time(0), t) < (3600 + 15 * 60); ) {
        cout << a;
    }

to

    for (time_t t = time(0); difftime(time(0), t) <= (3600 + 15 * 60); ) {
        cout << a;
    }

would that get rid of the 1 second margin?

Matt


Nope. The error is caused by a combination of the rate at which the clock is polled and the lack of synchronization between the clock and the program. It's very, very unlikely that the program will poll the clock at the exact moment the clock rolls over to a new second. You can make it more accurate by increasing the polling rate, but it's not going to be perfect.

There are also environmental factors. For example, preemptive multitasking really screws with timing. It's possible that there will be a context switch between the clock poll and the execution of the loop, so the iteration that begins at 9.999... seconds may not complete until some point far in the future.

Changing the test may have an undesirable effect: it makes it more likely that the program will run for more than 10 seconds. When in doubt, I favor having events arrive early rather than late; it's easier to wait a bit longer than to go back in time.

The short story is that you aren't going to get precise timing out of code like this. There are real-time systems that would enable you to do it, but that's a whole computing discipline in its own right.

Edited by jcl

