I have a few doubts related to execution time (though they are not much related to each other).
1. I submitted two solutions for the KINGCON problem (APRIL13), first and second. The only difference is that the first contains an extra statement (cout<<"dfs";) which is executed at most once per test case.
For the first one: time 0.37, mem 2.9M (Wrong Answer).
For the second one: time TLE, mem 28.2M (TLE). Why is that?
2. If we use, say, unsigned long long int in a place where plain int would suffice, why does the time taken increase?
3. Suppose the time limit for a problem is 2 seconds and we are considering an O(n) algorithm with n of order 10^8. Assume the processor we use executes about 10^8 instructions per second. The total number of instructions our code performs may then be anywhere from 2x10^8 to 5x10^8. So how do we know whether the code will execute in 2 seconds or 5 seconds, when we don't know the actual speed of the judge's processor?