I need help understanding the following problem:

Machine A runs a program in 10 s at 400 MHz.
We want machine B to run the same program in 6 s.

Machine B will require 1.2 times as many clock cycles as machine A.

Why is it 1.2? I can't figure it out.

Anyone?

I get a factor of 1.67 (that's just 10 s / 6 s, assuming machine B needs the same number of clock cycles as machine A). If 1.2 is supposed to be the correct factor, then you have to say that something else is important: maybe I/O on the new machine is faster, or the new machine has a faster/larger CPU-side cache, or ...
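For what it's worth, here is a quick sketch of the arithmetic, assuming the usual relation CPU time = clock cycles / clock rate (the variable names below are just illustrative, not from the original problem):

    # Assumed relation: CPU time = clock cycles / clock rate
    time_a   = 10.0       # seconds on machine A
    rate_a   = 400e6      # 400 MHz
    cycles_a = time_a * rate_a        # 4e9 clock cycles for the program on A

    time_b = 6.0          # target seconds on machine B

    # If B needed the same number of cycles as A, its clock rate would be:
    rate_b_same_cycles = cycles_a / time_b
    print(rate_b_same_cycles / rate_a)   # ~1.67, i.e. 10/6

    # If instead the 1.2 is taken as given (B needs 1.2x as many cycles as A):
    cycles_b = 1.2 * cycles_a
    rate_b   = cycles_b / time_b
    print(rate_b / 1e6)                  # 800.0 MHz, i.e. 2x machine A's clock

So one way to read the original statement is that the 1.2 is a given property of machine B rather than something you derive from the 10 s and 6 s; under that reading the question becomes what clock rate B needs (800 MHz), not where the 1.2 comes from.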
