
IT Friends,

Let me start by saying I am not an IT professional. Our company recently moved to a cloud-based environment, and we are seeing a substantial efficiency loss in the calculation times of our templates (mostly Excel-based). It's typically only seconds per calculation, but it adds up throughout the day.

My current desktop has the following setup: Windows 10, Office 16, Intel(R) Core(TM) i7-4790 CPU @ 3.60 GHz, 16.0 GB RAM, 64-bit operating system.

I am being told that no cloud environment can match my desktop's processing speed for Excel calculations. I find that very hard to believe. Is that true, and I just don't understand because I'm not an IT professional? My understanding of cloud computing was that the point was to match your home computing speed no matter where you were. It wouldn't make sense to me that, with all the advances in IT, we are moving backward in processing speed. Our current cloud environment runs through Citrix. If there is any other information you'd need to know, please ask and I will find it out.

Last Post by pty

There's no direct question here, but I'll guess you want to know whether a Citrix system can match your current run-of-the-mill i7-based PC. In your experience it hasn't fared well, but we don't know what hardware the Citrix server is running on. Until we know what powers that server, all we can say is that for your workload it's slower.
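One way to make the comparison concrete is to run the same single-threaded, CPU-bound benchmark on the desktop and inside the Citrix session and compare the timings. A minimal sketch in Python, assuming Python is available in both environments; the square-root loop is just an arbitrary stand-in for a spreadsheet recalculation, not anything Excel actually does internally:

```python
import time

def busy_calc(n=2_000_000):
    """Arbitrary CPU-bound work: stand-in for a single-threaded recalculation."""
    total = 0.0
    for i in range(1, n):
        total += i ** 0.5
    return total

start = time.perf_counter()
busy_calc()
elapsed = time.perf_counter() - start
print(f"single-core calc time: {elapsed:.3f} s")
```

Run it in both places a few times and average; if the Citrix session is consistently slower on a loop like this, the bottleneck is per-core CPU speed on the server, not Excel itself.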

Your management should have no problem with you continuing to use your desktop and switching to the cloud system when it's needed.

Citrix wasn't originally called cloud computing; that label came later, and it was more a branding effort than what I'd call true cloud computing. In my view, a true cloud is one where your compute job can be spread over many CPUs in a compute farm. A Citrix farm is usually pretty small and is sized to meet modest needs rather than those of a power user.

Your Citrix server is most likely sized for Word and email use, by the sound of it.


I develop on an average machine (a 2014 MBP, i5 with 8GB RAM). I don't need any more grunt because I have AWS at my fingertips. I can have a cluster of massively powered machines at a very reasonable price - you only pay for them while you use them.

If, however, I was a gamer, where latency is a key issue, I'd want that power a bit closer. But, I'd need to spend more money even if I only played three hours a week.

Pros and cons with both approaches.
