Okay, this probably sounds like a physics question, but there's actually a geosynchronous satellite question in our assignment... for CompSci...

The question asks how long it takes for a transmission to travel from Earth up to a geosynchronous satellite and then back down to another point on Earth.

What I'm confused about is whether there's some special equation to calculate the time taken for a signal to be sent up through space and back, or whether we can just use v = d/t (v = velocity, d = distance, t = time)?

Note: We were actually given the velocity and the distance from the satellite to Earth.

Or am I simply overthinking this and should just go with v = d/t?

Well, electromagnetic signals travel at (essentially) the speed of light, so v = d/t is exactly the right tool here. There's no special relativistic formula needed: relativity matters when you're accelerating objects with mass toward the speed of light, but a radio signal is just light, and its travel time over a distance d is simply t = d/v.

The atmosphere and the satellite's own electronics do add a tiny bit of extra delay, but for an assignment like this it's negligible. So take v = 3.0 x 10^8 m/s and the distance to the satellite you were given, compute the time for the uplink (ground to satellite) and the downlink (satellite to ground), and add the two together. If both ground stations are roughly the same distance from the satellite, that's just t = 2d / v.
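If you want to sanity-check your answer in code, here's a minimal sketch. It assumes the textbook geostationary altitude of about 35,786 km and treats both hops as the same length; if your assignment gives a different distance (or two different slant ranges), plug those values in instead.

```python
# Rough round-trip delay for a ground -> geostationary satellite -> ground link.
# Assumption: both ground stations sit roughly below the satellite, so each hop
# is just the altitude. Replace ALTITUDE_M with the distance from your assignment.

SPEED_OF_LIGHT_M_S = 3.0e8        # signal speed in m/s (value used in the thread)
ALTITUDE_M = 35_786_000           # assumed geostationary altitude in metres

def hop_time(distance_m: float, speed_m_s: float = SPEED_OF_LIGHT_M_S) -> float:
    """Time for one hop, from t = d / v."""
    return distance_m / speed_m_s

uplink = hop_time(ALTITUDE_M)     # Earth -> satellite
downlink = hop_time(ALTITUDE_M)   # satellite -> Earth
total = uplink + downlink

print(f"One hop:    {uplink * 1000:.1f} ms")
print(f"Round trip: {total * 1000:.1f} ms")   # about 239 ms with these numbers
```

With those assumed numbers each hop comes out to roughly 119 ms, so the full Earth-satellite-Earth trip is on the order of a quarter of a second.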
