Which factors of a PC determine the opening time of a program or a large file?

  • #1
mech-eng
TL;DR Summary
Analyzing the opening time of a computer program or a large file
Hello. Would you please explain the factors that determine the opening time of a program or a big file? For example, if you have a large Excel file, why is it difficult to open when it is very large? If it is in your PC's main storage, that is, either an HDD or an SSD, is it reading speed or writing speed that determines the opening time of the file and program? (For example, opening time of a CFD program might take relatively long, and so is boring.)

Regards,
 
  • #2
What do you think? What has your research turned up?
 
  • Like
Likes phinds, pbuk, Vanadium 50 and 1 other person
  • #3
mech-eng said:
(For example, opening time of a CFD program might take relatively long, and so is boring.)
I couldn't understand how something can be boring based on how much time it takes to open.:rolleyes:
 
  • #4
Wrichik Basu said:
I couldn't understand how something can be boring based on how much time it takes to open.:rolleyes:

Sorry for the poor statement. Waiting for it to open is boring.
 
  • #5
DaveC426913 said:
What do you think? What has your research turned up?

I couldn't find what I was looking for. I found a similar question, and the answers were mostly about switching from HDD to SSD and about RAM. My research didn't turn up anything about reading/writing speeds and opening time. One source mentioned SSD technology; for example:

Boot time using a solid-state drive averages about 10-13 seconds as compared to 30-40 seconds for a hard drive. Because SSDs use nonvolatile storage media that stores persistent data on solid-state flash memory, file copy/write speeds are faster as well.

https://quantumpc.com/upgrading-sol...
 
  • #6
Use Task Manager on Windows. Look at the % usage reported for CPU, memory, and disk before starting the big program. Obviously, they have some influence on the opening time.

Perhaps to make room for this new program, previous memory users must be paged out by the virtual memory system. Perhaps there is not much CPU time available.
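
If you prefer to capture the same numbers from a script rather than watching Task Manager, here is a minimal sketch. It assumes Python with the third-party psutil package installed; the one-second sampling interval is just an illustrative choice.

Code:
# Snapshot CPU, memory and disk counters before launching the big program.
# Requires the third-party "psutil" package (pip install psutil).
import psutil

cpu = psutil.cpu_percent(interval=1)      # average CPU load over one second
mem = psutil.virtual_memory()             # overall RAM usage
disk = psutil.disk_io_counters()          # cumulative disk reads/writes since boot

print(f"CPU usage   : {cpu:.1f} %")
print(f"Memory used : {mem.percent:.1f} % of {mem.total / 2**30:.1f} GiB")
print(f"Disk I/O    : {disk.read_bytes / 2**20:.0f} MiB read, "
      f"{disk.write_bytes / 2**20:.0f} MiB written since boot")

Run it once before and once while the program is opening; whichever counter is pinned is the first suspect.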
 
  • Like
Likes FactChecker
  • #7
Excel is a bit unpredictable.

Obviously Excel needs to read the file, but it also tends to write a backup to disk, which makes writing speed relevant. If the sheet has lots of formulas, opening will also depend on CPU speed and memory speed. Which of these becomes the bottleneck depends on the file and its content.
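
As a rough way to separate the raw disk read from the parsing and recalculation work, here is a sketch assuming Python with the third-party openpyxl package; the file name big_sheet.xlsx is a hypothetical placeholder.

Code:
# Split the "open time" of a spreadsheet into raw disk read vs. parsing.
# Requires the third-party "openpyxl" package; the file name is a placeholder.
import time
import openpyxl

path = "big_sheet.xlsx"

t0 = time.perf_counter()
with open(path, "rb") as f:
    raw = f.read()                    # pure disk read of the file
t1 = time.perf_counter()
wb = openpyxl.load_workbook(path)     # reads again and builds objects (CPU/RAM heavy)
t2 = time.perf_counter()

print(f"raw read : {t1 - t0:.2f} s for {len(raw) / 2**20:.1f} MiB")
print(f"full load: {t2 - t1:.2f} s (second read plus parsing)")

If the second number dwarfs the first, the bottleneck is not the drive. (Note that the operating system will likely serve the second read from its file cache.)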
 
  • Like
Likes Tom.G
  • #8
You can open the Performance Monitor and see what the bottleneck(s) might be. Look at the disk activity, the CPU activity, and the memory % used. Any of those can be the cause of serious slowdowns.

To bring up a performance display, press <Ctrl><Alt><Delete> (all 3 keys at once). When the menu pops up, select "Task Manager", then select the "Performance" tab.
 
  • #9
If you have 32 bit Excel then it can only use 2GB of RAM. This can make opening a very large spreadsheet exceedingly slow as chunks are swapped between RAM and disk/SSD as it loads and calculates.
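
For a feel of why that limit bites, here is a back-of-the-envelope sketch in Python; the file size and the in-memory expansion factor are purely illustrative assumptions, not measured figures for Excel.

Code:
# Rough arithmetic on the 32-bit address-space limit (illustrative numbers only).
file_size_gib = 0.5    # assumed size of the .xlsx on disk (compressed XML)
expansion = 8          # assumed blow-up once parsed into in-memory objects
limit_gib = 2          # usable address space of a 32-bit process

needed_gib = file_size_gib * expansion
verdict = "will thrash/swap" if needed_gib > limit_gib else "fits in memory"
print(f"needs about {needed_gib:.0f} GiB, limit is {limit_gib} GiB -> {verdict}")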
 
  • Informative
Likes FactChecker and anorlunda
  • #10
Thank you for the replies. From the viewpoint of the drive (HDD or SSD), what factors affect the situation? Reading speed? Writing speed? The drive's cache? Anything else? I think one factor is how full the drive is; for example, a drive that is 10% full would open the same file or program more easily than the same drive at 90% full. Or are those unrelated factors?
 
  • #11
mech-eng said:
Thank you for the replies. From the viewpoint of the drive (HDD or SSD), what factors affect the situation? Reading speed? Writing speed? The drive's cache? Anything else? I think one factor is how full the drive is; for example, a drive that is 10% full would open the same file or program more easily than the same drive at 90% full. Or are those unrelated factors?
Have you looked at the Performance Monitor and determined that I/O to a drive is the bottleneck?
 
  • #12
FactChecker said:
Have you looked at the Performance Monitor and determined that I/O to a drive is the bottleneck?

Right now I don't have such a big file. I am asking the question as a general one; there is no bottleneck at the moment.
 
  • #13
mech-eng said:
I am asking the question as a general one.
Well, the 'opening time' also depends on the system overhead. A well messed-up battlefield of various antivirus/antimalware software and bugs, built on the wasted chunks of several driver updates and 'optimization software', may make any work a nightmare regardless of the hardware underneath.
 
  • Haha
Likes pbuk
  • #14
Rive said:
Well, the 'opening time' also depends on the system overhead. A well messed-up battlefield of various antivirus/antimalware software and bugs, built on the wasted chunks of several driver updates and 'optimization software', may make any work a nightmare regardless of the hardware underneath.

But does it have anything to do with "reading/writing speed"?
 
  • #15
mech-eng said:
But does it have anything to do with "reading/writing speed"?
...
is it reading speed or writing speed that determines the opening time of the file and program?
Reading/writing speed (usually, and hopefully, just the former for the question described) is the starting point, but a lot also depends on the circumstances. For example, your CFD program likely won't just grab the data from the disk; it will also do a lot of unpacking and data manipulation (CPU- and memory-intensive tasks) to turn the stored data into a convenient environment for computation before it's ready to operate. A file server, by contrast, would just grab the data and push it through the network interface with minimal fuss. Different tasks, different factors.
But R/W speed is where it all starts.
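
If you want to see where your own setup sits, a crude sequential-read check is easy to do. This sketch assumes Python and an existing large file at the hypothetical path big_case.dat; note that a repeat run may be served from the OS file cache and look unrealistically fast.

Code:
# Crude sequential-read benchmark of whatever drive holds the file.
# The path is a placeholder; repeat runs may hit the OS file cache.
import time

path = "big_case.dat"
chunk = 8 * 2**20                 # read in 8 MiB chunks
total = 0

t0 = time.perf_counter()
with open(path, "rb") as f:
    while True:
        data = f.read(chunk)
        if not data:
            break
        total += len(data)
dt = time.perf_counter() - t0

print(f"{total / 2**20:.0f} MiB in {dt:.2f} s -> {total / 2**20 / dt:.0f} MiB/s")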
 
  • Informative
Likes mech-eng
  • #16
Rive said:
Reading/writing speed (usually just the former) is the starting point,

So a higher writing speed means faster opening of a program or a large file?
 
  • #17
mech-eng said:
So a higher writing speed means faster opening of a program or a large file?
It's supposed to be 'reading', but without knowing the circumstances there is no definite answer to that.
Faster reading may allow faster opening, while slow reading is sure to make it slow.
 
  • #18
Here is a slightly more complex situation: what if the file is on an SD card or a flash stick? Does putting the file on an SD card or a flash stick delay the opening time?
 
  • #19
Yes. In such cases even the interface might matter: the same stick on a USB2 port versus a USB3 port can be the difference between hell and heaven.

Best to copy it all to the main storage first if possible, or at least get a fast card/stick on a fast port.
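
To put some rough numbers on that, here is a small back-of-the-envelope sketch in Python. The throughput figures are typical practical values, not exact specifications, and the 2 GiB file size is just an example.

Code:
# Rough transfer times for a large file over different links (illustrative values).
file_gib = 2.0
links_mb_per_s = {
    "USB 2.0 stick (~35 MB/s)": 35,
    "USB 3.0 stick (~300 MB/s)": 300,
    "internal SATA SSD (~500 MB/s)": 500,
}

size_mb = file_gib * 1024
for name, speed in links_mb_per_s.items():
    print(f"{name:30s}: about {size_mb / speed:6.1f} s just to read the data")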
 
  • Informative
Likes mech-eng
  • #20
What fields and textbooks deal with this topic? Hardware? Communication? PC communication? PC architecture? Electronics?
 
  • #21
mech-eng said:
What fields and textbooks deal with this topic?
If you want to get Down-and-Dirty, here is a good start:

Computer Architecture (including comparisons/tradeoffs between different CPU architectures)

Computer hardware design (storage devices [both external and RAM], interfaces to them)

Computer Science (Operating System Design)

Expect a few years to get thru the above.

Have Fun!
Tom
 
  • Informative
Likes mech-eng
  • #22
File defragmentation might be an issue on older drives.

other things to consider:
Software plugins (is there a safe mode?),
license managers,
networking (e.g. file access on network drives, software-configuration of network printers).
[If you can run the file without a network, try opening the file with networking turned off.]

Could it just be the software? (Try an alternate version (old version or updated version). Try another computer.)
Or could it be a problem with the particular file that is being opened?

If the issue is file size, can you copy the file and another file of the same size?
Are the results similar?
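
A minimal sketch of that copy comparison, assuming Python; the file names are hypothetical placeholders for the slow file and another file of about the same size.

Code:
# Compare copy times of the slow-to-open file and another file of similar size.
# If only the problem file is slow to copy, the file (or its disk blocks) is suspect.
import os
import shutil
import time

candidates = ["slow_to_open.xlsx", "other_same_size.dat"]   # hypothetical names

for src in candidates:
    dst = src + ".copy"
    t0 = time.perf_counter()
    shutil.copyfile(src, dst)
    dt = time.perf_counter() - t0
    size_mib = os.path.getsize(src) / 2**20
    print(f"{src}: {size_mib:.0f} MiB copied in {dt:.2f} s")
    os.remove(dst)                                          # clean up the temporary copy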
 
  • Informative
Likes mech-eng
  • #23
robphy said:
File defragmentation might be an issue on older drives.
File fragmentation might be an even bigger issue. :oldbiggrin:
 
  • Like
  • Sad
Likes hutchphd and robphy
  • #24
robphy said:
If the issue is file size, can you copy the file and another file of the same size?
Are the results similar?

Yes, the issue is file size. Copy? Why copy it? The problem is just the "opening time". That's all. And thanks for all of those possibilities.
 
  • #25
mech-eng said:
Why copy it? The problem is just the "opening time". That's all.
And - er - how is that working out for you? Not well?

See, the point robphy is trying to address is to examine the file with a less direct approach, to see if there's something peculiar about this specific file. Copying it would provide useful information that might open other avenues of analysis.
 
  • #26
DaveC426913 said:
Copying it would provide useful information that might open other avenues of analysis.
For instance, if you hear some "clicks" coming from the computer while copying, you likely have a failing disk drive that keeps retrying to either read or write.

Or if you can hear the disk drive rapidly seeking (positioning to various tracks), the system may be low on memory... or it may be another clue of a failing disk drive.

If the file is rapidly copied with no problems, then the program you are trying to run may be spending much of its time processing or parsing the data as it comes in; it may not be reading the file sequentially, forcing many disk seeks to different places in the file; or, again, your system may be low on memory.
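
To see the effect of seeking directly, here is a crude sketch assuming Python and a large existing file at the hypothetical path big.dat. On a spinning disk the scattered pattern is typically far slower, while on an SSD the gap is much smaller (and the OS file cache can blur the result on repeat runs).

Code:
# Compare sequential vs. scattered 4 KiB reads from the same large file.
import os
import random
import time

path = "big.dat"                  # placeholder for a large existing file
block = 4096
n_reads = 2000
size = os.path.getsize(path)

def timed_reads(offsets):
    t0 = time.perf_counter()
    with open(path, "rb") as f:
        for off in offsets:
            f.seek(off)
            f.read(block)
    return time.perf_counter() - t0

sequential = [i * block for i in range(n_reads)]
scattered = [random.randrange(0, size - block) for _ in range(n_reads)]

print(f"sequential reads: {timed_reads(sequential):.2f} s")
print(f"scattered reads : {timed_reads(scattered):.2f} s")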

That's for starters off the top of my head. I'm fairly sure others here can come up with many more possibilities.

Cheers,
Tom
 
  • #27
FactChecker said:
Have you looked at the Performance Monitor and determined that I/O to a drive is the bottleneck?
A related question: if the bottleneck is either the disk or the CPU, is doubling the RAM a meaningful act? If you are having issues related to the CPU or the disk, is doubling the RAM a bad idea?
 
  • #28
mech-eng said:
A related question: if the bottleneck is either the disk or the CPU, is doubling the RAM a meaningful act? If you are having issues related to the CPU or the disk, is doubling the RAM a bad idea?
Meaningful act? - probably not. Best case a bit expensive; worst case, more time for a catastrophic disk failure.

(In my tower computer, disks seem to last between 10,000 and 30,000 hours, then spindle or positioner bearings wear out and/or the read/write heads start contacting the disk surface (that scrapes all the data off). I have 4 drives on the shelf waiting for me to liberate those super-strong magnets from the head positioner; great refrigerator magnets, but don't get your fingers pinched! The disk platters themselves make good aberration-free, front surface mirrors.)

Cheers,
Tom
 
  • Like
Likes hutchphd
  • #29
mech-eng said:
If the bottleneck is either the disk or the CPU, is doubling the RAM a meaningful act?
If it's the disk, then no. If it's the CPU, then maybe. Many CPU-intensive tasks are sensitive to the amount of memory too.
 
  • Informative
Likes mech-eng
  • #30
Rive said:
If it's the disk, then no. If it's the CPU, then maybe. Many CPU-intensive tasks are sensitive to the amount of memory too.

This means RAM has an effect on CPU performance. Is the opposite also true? Would you please briefly explain those things, introducing a few technical terms?
 
  • #31
mech-eng said:
Would you please
Would you please elaborate on the problem?
We have been shooting in the dark here for almost a week already, and it's not really productive this way.
 
  • #32
Rive said:
We have been shooting in the dark here for almost a week already, and it's not really productive this way.
Sorry for that.

Rive said:
Would you please elaborate on the problem?

The problem occurred in the past, but in a few weeks I might encounter it again. Then I can give detailed information about my specific problem. But I thought my second question, about RAM and CPU interaction, was a clear and general one.
 
  • #33
FactChecker said:
Have you looked at the Performance Monitor and determined that I/O to a drive is the bottleneck?
mech-eng said:
A related question: if the bottleneck is either the disk or the CPU, is doubling the RAM a meaningful act? If you are having issues related to the CPU or the disk, is doubling the RAM a bad idea?
That is not an answer. Why can't you give us more information? Is this something that happened a while ago and you cannot replicate it?
 
  • #34
FactChecker said:
Why can't you give us more information? Is this something that happened a while ago and you cannot replicate it?

Yes, it happened a while ago. I will give you the information from the Performance Monitor in the future, when my file becomes larger. I don't have such a big file right now.
 
  • Like
Likes FactChecker
  • #35
mech-eng said:
Yes, it happened a while ago. I will give you the information from the Performance Monitor in the future, when my file becomes larger. I don't have such a big file right now.
That explains it. If you have the opportunity now, it might be wise to make a simple, large file to test it. Otherwise, when you have a real file that large, you might not have time to fix anything.
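
A minimal sketch of creating such a test file, assuming Python; the path and the 1 GiB size are arbitrary choices, and random bytes are used so compression doesn't make the file artificially easy to handle.

Code:
# Create a throwaway ~1 GiB test file to reproduce the slow-open scenario.
import os

path = "test_1GiB.bin"                    # placeholder name
chunk = os.urandom(8 * 2**20)             # 8 MiB of random bytes

with open(path, "wb") as f:
    for _ in range(128):                  # 128 x 8 MiB = 1 GiB
        f.write(chunk)

print(f"{os.path.getsize(path) / 2**30:.1f} GiB written to {path}")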
 