Okay, this probably sounds like a physics question, but there's actually a geosynchronous satellite question in our assignment... for CompSci...

The question asks how long it takes for a transmission to go from Earth up to a satellite and then be relayed back down to another point on Earth.

What I'm confused about is: is there some sort of special equation to calculate the time taken for a signal to be sent up to space and back, or can we just use v = d/t (v = velocity, d = distance, t = time)?

Note: we were actually given the velocity and the distance from the satellite to Earth.

Or am I simply overthinking this and should just go with v = d/t?

Well, electromagnetic signals travel at the speed of light, and v = d/t is exactly the relation you need; just rearrange it to t = d/v. There's no special formula for this case: the relativistic effects you may be thinking of (mass increasing as an object approaches the speed of light) apply to accelerating massive objects, not to working out how long a radio signal takes to cover a given distance.

In practice the atmosphere and the satellite's own processing add a small extra delay, but for an assignment you can ignore that. Use v = 3.0 x 10^8 m/s, work out the time for the uplink (ground station to satellite) and the downlink (satellite to the other ground station) from the distances you were given, and add the two together.
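
Here's a minimal sketch of that calculation in Python, assuming the signal moves at the speed of light the whole way and using the textbook geostationary altitude of about 35,786 km for both legs (swap in whatever distance your assignment actually gives you):

    # Delay for ground -> geosynchronous satellite -> ground.
    # Assumes the signal travels at the speed of light and that both legs are
    # roughly the geostationary altitude (~35,786 km above the equator);
    # replace ALTITUDE_M with the distance from your assignment if it differs.

    C = 3.0e8              # signal speed in m/s (speed of light)
    ALTITUDE_M = 35_786e3  # assumed ground-to-satellite distance in metres

    uplink_time = ALTITUDE_M / C    # t = d / v for the ground-to-satellite leg
    downlink_time = ALTITUDE_M / C  # t = d / v for the satellite-to-ground leg
    total_time = uplink_time + downlink_time

    print(f"Uplink:   {uplink_time * 1000:.1f} ms")    # ~119 ms
    print(f"Downlink: {downlink_time * 1000:.1f} ms")  # ~119 ms
    print(f"Total:    {total_time * 1000:.1f} ms")     # ~239 ms

So with the numbers you were actually given, the answer falls straight out of t = d/v for each hop; the only subtlety is that the real path is a bit longer if the ground stations aren't directly under the satellite.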
