
Background information

The Valerian calculation: Lars' proposal

Dominik Bärlocher
25.7.2017
Translation: machine translated

Last week, I worked out in an article how big Wikipedia will be in the year 2700, and I asked you to do the maths better if you can. Reader Lars Pautsch has a suggestion for how big Wikipedia will be in Valerian's time.

My question: How big will Wikipedia be in the year 2700?

The reason I'm asking is the film "Valerian and the City of a Thousand Planets", in which the database of the Alpha space station plays a central role. That database contains the collective knowledge of over 3,300 planets. So I calculated Wikipedia's average growth, extrapolated it linearly to the year 2700 and multiplied the result by 3,300. Even while doing the maths, I realised the calculation was inaccurate.
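
For reference, that linear back-of-the-envelope approach looks roughly like this in Python. This is just my sketch, not the exact maths from the original article: the figures are the ones quoted further down in this piece, and I've reused the same 698-year term that Lars works with below.

# The linear approach: constant article growth over the same 698-year term
# the compound calculation below uses, then scaled up to 3,300 planets.
# All figures are the ones quoted later in this article.
ARTICLES_NOW = 5_440_850            # current total number of Wikipedia articles
ARTICLES_PER_YEAR = 302_269.4444    # average new articles per year
MB_PER_ARTICLE = 6.35985333174044   # average article size in MB
PLANETS = 3300                      # planets in the Alpha database
YEARS = 2700 - 2002                 # the 698-year term used below

articles_2700 = ARTICLES_NOW + ARTICLES_PER_YEAR * YEARS   # roughly 216 million articles
data_mb = articles_2700 * MB_PER_ARTICLE * PLANETS         # roughly 4.5e+12 MB in total
print(f"{articles_2700:,.0f} articles, {data_mb:.3e} MB for {PLANETS} planets")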

Lars writes: "I thought you could simply repurpose the compound interest calculation for this and work out the exponential curve. I'm not quite sure whether that really fits, but it was worth a try."

The 38-year-old designer also explained the calculation himself and attached it to the email as an Excel sheet.

He took the basic data from my article:

  • A Wikipedia article is 6.36 MB (exactly: 6.35985333174044 MB) in size on average, including pictures, all edits and everything.
  • Wikipedia went live on 1 January 2002
  • Valerian's adventure takes place in the 28th century. In our example, we take the year 2700.

Lars does the maths

His explanation: "I used the compound interest formula so that the number of articles doesn't just grow by the same amount each year; instead, each year's new articles are added to the base, giving an exponential increase in the number of articles: Ke = Ka * (1 + p/100)^n"

"This multiplies the final capital (in this case our final article quantity or ultimately the final data quantity) by an amount according to the formula from the annual interest rate p (in this case this is the article growth by which the total article quantity increases each year) and this times the number of years of the term (here 2700-2002 = 698 years term)."

But first he calculated the interest rate. It works like this:

Articles per year / total number of articles * 100

or

302269.4444 / 5440850 * 100 = 5.55555554739
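
Or, as a couple of lines of Python (same figures as above; the variable names are mine):

ARTICLES_PER_YEAR = 302_269.4444    # average new articles per year
TOTAL_ARTICLES = 5_440_850          # total article count used as the base

growth_rate = ARTICLES_PER_YEAR / TOTAL_ARTICLES * 100
print(growth_rate)                  # ≈ 5.55555554739 per cent per year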

From this, he derives the following calculation:

Final article count sought for the year 2700 = total article count in 2002 (here 5440850 articles) * (1 + article growth as an interest rate (here 5.55555554739)/100), raised to the power of the term in years (here 698)

or

x = 5440850 * (1+5.55555554739/100)^698

You can tell from a mile off that this is going to be an insanely high number. The article count is still just about graspable: according to Lars, it comes to 133 497 045 298 808 000 000 000. Unless your editor is mistaken, that's around 133 sextillion articles.
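
Plugging the numbers into the compound interest formula reproduces that figure, give or take some rounding. Again, just a sketch in Python:

base_articles = 5_440_850           # the article count used as starting capital
growth_rate = 5.55555554739         # per cent per year, from the step above
years = 2700 - 2002                 # the 698-year term

articles_2700 = base_articles * (1 + growth_rate / 100) ** years
print(f"{articles_2700:.6e}")       # ≈ 1.335e+23, i.e. roughly 133 sextillion articles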

From here on, it gets easier. An article is 6.36 MB in size, so Lars simply multiplied the 133 sextillion articles by 6.36.

133497045298808000000000 * 6.35985333174044 = 8.49022E+23
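
The same step in Python, for anyone following along (this just re-runs the article's arithmetic):

articles_2700 = 133_497_045_298_808_000_000_000   # Lars' article count for the year 2700
MB_PER_ARTICLE = 6.35985333174044                 # average article size in MB

data_mb = articles_2700 * MB_PER_ARTICLE
print(f"{data_mb:.5e} MB")                        # 8.49022e+23 MB, as in the article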

Right, this is where it gets hairy. The result is even harder to grasp than the 133 sextillion. But Lars has gone to the trouble of breaking it down.

"If you count the zeros: According to abacus-friedrich, 1x10²³ is 100 trillions.
Here the conversion is described: And if you then add 3 more zeros to the 18 zeros of exabyte you are at zettabyte with 21 zeros and we have 23. therefore 849 zettabyte... plus / minus."

Then multiply that by 3300.

8.49022E+23 * 3300 = 2.8017726E+27

or, a little more human-friendly

849 * 3300 = 2801700

That's 2,801,700 zettabytes. And this is where it gets really hairy: that's about 2,801.7 yottabytes. Incidentally, contrary to rumour, the unit is not named after the Star Wars character Yoda. Put another way, that's 2.8 xenottabytes.
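
And the final step as one last Python sketch, counting the zeros the same way the article does (21 zeros for a zettabyte, 24 for a yottabyte):

# Following the article's zero-counting: the 8.49022e+23 figure is read
# directly against the 21 zeros of a zettabyte and the 24 of a yottabyte.
ZETTA = 10 ** 21
YOTTA = 10 ** 24

alpha_total = 8.49022e23 * 3300   # one Wikipedia per planet, 3,300 planets
print(f"{alpha_total / ZETTA:,.1f} zettabytes")   # ≈ 2,801,772.6
print(f"{alpha_total / YOTTA:,.1f} yottabytes")   # ≈ 2,801.8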

Wikipedia has an interesting little fact about the yottabyte.

Why are you doing this?

All calculations aside, the question is: Lars, why are you doing this? "I'm a fan of Luc Besson films. I think he makes the best films ever, alongside Ridley Scott. My favourite film of his is «The Fifth Element» with Milla Jovovich," he writes.

Milla Jovovich as Leeloo in «The Fifth Element»

That's reason enough for him to work out how many yottabytes or xenottabytes a fictional database would hold.

However, Lars admits that the calculation is probably not correct. "It's just a bit of fun maths," he says. He's not entirely sure whether the calculation works at all, or whether the result is in any way realistic.

So you think you can do better? Work out how big Wikipedia will be in the year 2700, growth curve and all, and send me an email. Because I think we can crack this. Since Lars has won the cinema ticket, we would have run out of prizes. Would have. Because the people responsible at film distributor Pathé Films have rummaged through their cupboards and drawers and found another colour-changing cup. Whoever dares to take on the calculation gets it.

And finally, this: the trailer for "Valerian and the City of a Thousand Planets" features a cover version of Coolio's "Gangsta's Paradise", of all things. Would you believe it?

Here's the trailer:

And here's the song, without the sound effects and everything:

Journalist. Author. Hacker. A storyteller searching for boundaries, secrets and taboos – putting the world to paper. Not because I can but because I can’t not.

