The Software Purist |



The Internet on Mars

Lately, I’ve seen a few programs on National Geographic about life on other worlds, including the Moon and Mars.  One in particular that I saw recently discusses the colonization and terraforming of Mars.  I find this really interesting, as it’s clearly a lengthy and complicated process, especially considering we’ve never had a person on Mars yet and two-thirds of all attempts to land on Mars have been unsuccessful.  Then I got to thinking of something way more important (he says with a smirk): how would the internet work on Mars?

Consider this for a moment.  Let’s assume we could set up a system on Mars similar to the one we have on Earth.  Surely, there will be the desire to view webpages on Earth from Mars and vice versa.  Yet, with current technology, this would be very difficult, as both bandwidth and latency between the planets are many orders of magnitude worse than between two points on Earth, or even between Earth and the Moon.  Let’s explore the problem in more detail.

Distance from Earth to Mars

At its closest recorded distance, in 2003, Mars was just under 56 million km from Earth.  At its furthest, the distance can be about 380 million km.  So, let’s do some math.  Assuming that traffic can be broadcast at the speed of light:

lowest time = displacement / velocity = least distance / speed of light = 56 x 10^9 m / 3 x 10^8 m/s = 186 s = 3:06 minutes

highest time = displacement / velocity = greatest distance / speed of light = 380 x 10^9 m / 3 x 10^8 m/s = 1267 s = 21:07 minutes
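These figures are easy to check with a few lines of Python (a quick sketch, using the same approximate distances as above):

```python
# One-way light-travel time between Earth and Mars (rough figures).
SPEED_OF_LIGHT = 3e8  # m/s (approximate)

def one_way_delay_s(distance_m: float) -> float:
    """Return the one-way signal delay in seconds at a given distance."""
    return distance_m / SPEED_OF_LIGHT

closest = one_way_delay_s(56e9)    # ~56 million km, closest approach
farthest = one_way_delay_s(380e9)  # ~380 million km, furthest separation

print(f"closest:  {closest / 60:.1f} min")   # ~3.1 min
print(f"farthest: {farthest / 60:.1f} min")  # ~21.1 min
```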

The Problem

To generalize a bit, so as not to overstate the problem: internet protocols generally do handshaking at the lower levels.  But if you want to view the news on a site hosted on the other planet, this cannot work, because most protocols will time out long before a round trip at these latencies completes.  AFAIK, HTTP uses TCP/IP, which, of course, would be problematic for this case.  Even if we had larger timeouts, there would still be large delays between grabbing blocks of data, so that solution doesn’t seem particularly feasible either.  From this, we know we would have to use some sort of offline caching and require a different protocol.  Logging into a server on a different planet would not really be feasible with current technology.  But that doesn’t mean that getting a static, delayed view is unfeasible.
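To put a number on why handshaking fails: even ignoring timeouts entirely, TCP’s three-way handshake costs several one-way trips before the first byte of a response can arrive.  A back-of-the-envelope sketch, using the closest-approach delay from above:

```python
# Minimum time before the first byte of an HTTP response arrives over TCP,
# counting only speed-of-light delay (no processing, no retransmits).
ONE_WAY_S = 187  # one-way delay at closest approach, seconds (approximate)

# SYN -> SYN/ACK -> ACK piggybacked with the HTTP request -> response:
# four one-way trips at an absolute minimum.
trips = 4
first_byte_s = trips * ONE_WAY_S
print(f"first byte after ~{first_byte_s / 60:.1f} minutes")  # ~12.5 minutes
```

And that is the best case; at the furthest separation the same four trips take around an hour and a half.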

The Solution

For this part of the solution, I’m going to cheat just a little bit.  Let’s assume latency is an issue that we cannot solve with current technology, but let’s also assume infinite bandwidth.  Of course, infinite bandwidth isn’t realistic with current technology either, but bandwidth is something we have better prospects of increasing as technology improves, in the near term.  So, we know we can’t use TCP, so we would need connectionless, UDP-style communication.  My solution is as follows: for pages to be viewed “offline” on both planets, each planet’s internet would need central servers that store cached copies of pages.  It would be each planet’s responsibility to broadcast the offline data to the other planet, so that the cache could be updated.  Of course, we already do this with the Mars rovers, but this is at a much larger scale, since it would provide some subset, or hopefully cached views, of the entire internet.
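As a very rough sketch of the one-way broadcast idea, here is what a sender and receiver might look like using plain UDP sockets.  The page format, the one-page-per-datagram framing, and the loopback demonstration are all my own simplifications; a real link would need chunking, checksums, and redundancy, since nothing is ever acknowledged:

```python
import socket

# Sketch: one-way broadcast of cached pages as UDP datagrams.
# The receiving planet never acknowledges anything; it just rebuilds
# its local cache from whatever datagrams arrive.

def broadcast_cached_page(sock, relay, url, body):
    """Send one cached page: the URL, a newline, then the raw body."""
    sock.sendto(url.encode() + b"\n" + body, relay)

def receive_cached_page(sock):
    """Receive one datagram and split it back into (url, body)."""
    data, _addr = sock.recvfrom(65535)
    url, _, body = data.partition(b"\n")
    return url.decode(), body

# Demonstration over loopback (standing in for the interplanetary link):
receiver = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
receiver.bind(("127.0.0.1", 0))
sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
broadcast_cached_page(sender, receiver.getsockname(),
                      "http://example.com/", b"<html>cached copy</html>")
url, body = receive_cached_page(receiver)
print(url, len(body))  # http://example.com/ 24
```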

Perhaps the broadcast is a constant process.  A spider would go through, just like Google’s spiders do, and grab cached versions of pages for offline viewing.  Then, this is broadcast to the other planet.  On the receiving side, there would be some central site for each planet’s internet, which is used for displaying all the cached pages, and perhaps there would be alternate versions of search engines for searching cached pages, rather than real pages.  Perhaps there would be a new extension, such as .mars, so that a Mars site (such as The Software Purist) would be marked as a cached Mars page when you’re viewing it from Earth, and vice versa if you’re viewing a page from Mars that is cached from an Earth page.  An open question would be whether it would ever be realistic to download images, Flash SWFs, and other types of media, as well.
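To illustrate the .mars naming idea (the suffix scheme and helper names here are purely hypothetical, not any real standard):

```python
from urllib.parse import urlsplit, urlunsplit

# Hypothetical scheme: mark cached copies from the other planet by
# appending a planetary suffix to the hostname,
# e.g. example.com -> example.com.mars.
def to_cached_url(url: str, suffix: str = "mars") -> str:
    parts = urlsplit(url)
    return urlunsplit(parts._replace(netloc=parts.netloc + "." + suffix))

cache = {}  # rewritten URL -> cached page body

def store_page(url: str, body: bytes, suffix: str = "mars") -> None:
    """File an incoming broadcast page under its rewritten name."""
    cache[to_cached_url(url, suffix)] = body

store_page("http://example.com/news", b"<html>...</html>")
print(to_cached_url("http://example.com/news"))  # http://example.com.mars/news
```

A cached search engine could then index only keys of this form, making it obvious to the reader that every result is a delayed snapshot rather than a live page.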


Ultimately, the Internet on Mars would work similarly to the Internet on Earth.  But, when viewing the “InterSpaceNet” between planets, it would work a little differently.  Interactive communication is not possible with current technology, and current protocols are insufficient for communication due to latency concerns.  I went through ways in which this could work through an offline caching mechanism.  I suppose this is a slightly strange topic to think about, but it was something I wondered about and, hopefully, you’ll find it interesting.  Please let me know what you think of my conclusions and whether you have different ideas.



  • Salaar · June 27, 2010 at 3:55 am

    Nice idea.. but I wanted to ask: what would you propose we do when the direct link between Mars and Earth is disturbed for some time, when the Sun comes between Earth and Mars?

    Use MarsSat Program?

    And would u clarify the infinite bandwidth??



  • Lance W · January 5, 2011 at 11:18 pm

    I imagine for going around the sun, you could just have some relay satellites that orbit the sun as close as possible without getting interference, with enough satellites that some are always visible and can see each other; these could relay information around the sun to the other planets.


  • Author comment by softwarepurist · January 6, 2011 at 1:41 am

    Good thoughts, Lance. I agree with the idea that there would have to be other satellites to get around this problem.


  • Lapė Kalė · April 1, 2011 at 5:47 am

    If we really face the constraint of such latency, we could forget the web as it is now. There would be no problem if the web were static and everything could be cached, but most web content today is dynamic. Imagine your friend on Mars updates their Facebook status, and it takes ten minutes for you to see it. Forget about all modern web features, like Ajax auto-completers, real-time chats, and any dynamic content. Ten minutes may be way too long to wait sometimes.

    Such latency is too much to cope with. We must find a way to avoid it.


    • Author comment by softwarepurist · June 22, 2011 at 5:10 am

      Hi Lapė! Good points. It would definitely be a different kind of web, at least between planets. I agree, things like Ajax certainly wouldn’t work. It would be closer to web 1.0 concepts than web 2.0, unless the latency issue was somehow solved — or the dynamic content would have to operate against a cache, as you said. Thanks for your input! 🙂




