18 - Indefinite life, Driving your car with a number, generating photos of anyone doing anything
My recent thoughts in <400 words
Driving your car with a number
- In case you didn't know, all software consists of binary numbers. Everything on your computer or phone - including these words of mine that you are reading - exists as a string of 1s and 0s. This includes all of the images that you see, every application that you use, every video that you watch.
- One cool implication of this is that every piece of software exists, ultimately, as one long number.
- This means that any machine learning model that I train (such as a recent mini-project that identifies mushrooms) is ultimately one long number.
- Stepping back further, this means that one long number can drive a car. Tesla's self-driving software is ultimately one long number. The number is written in such a way that it can recognise street signs, dodge pedestrians, and change lanes on highways.
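The "one long number" claim above is easy to demonstrate. A minimal sketch (the byte string stands in for any file - a program, an image, a trained model):

```python
# Any sequence of bytes can be read as a single integer. This tiny
# "program" is just an illustrative byte string; a real executable or
# model file would work exactly the same way.
program = b"print('hello')"

# Interpret the bytes as one big-endian integer - the software's "number".
as_number = int.from_bytes(program, byteorder="big")
print(as_number)

# The mapping is reversible: the number alone recovers the software.
recovered = as_number.to_bytes(len(program), byteorder="big")
assert recovered == program
```

The same round trip works on Tesla's self-driving binaries - the number is just astronomically larger.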
Generating photos of anyone doing anything
- Very shortly (perhaps already), we will be able to generate entirely realistic images of anything.
- For example, this website (thispersondoesnotexist.com) contains lots of images of people.
- All of the images are fake. The site produces them using a machine learning technique called a GAN (Generative Adversarial Network).
- Most of the images are very realistic, despite the implementation being relatively old (2019). Here is another example using the same technique: thiscatdoesnotexist.com. Here is an older example where a model produced fake footage of Barack Obama. If you didn't know these were fake, would you be able to tell?
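To make the GAN idea concrete, here is a toy sketch on 1-D data - not the StyleGAN-scale model behind thispersondoesnotexist.com, just the same adversarial principle in miniature. Two players train against each other: a generator that maps random noise to fake samples, and a discriminator that scores how "real" a sample looks. All the parameters and numbers below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# "Real" data: samples drawn from a normal distribution with mean 4.
def real_batch(n):
    return rng.normal(4.0, 1.0, n)

# Generator G(z) = a*z + c and discriminator D(x) = sigmoid(w*x + b),
# both deliberately tiny so the adversarial loop is easy to follow.
a, c = 1.0, 0.0          # generator parameters
w, b = 0.0, 0.0          # discriminator parameters
lr, n = 0.05, 64

for step in range(2000):
    z = rng.normal(0.0, 1.0, n)
    x_real, x_fake = real_batch(n), a * z + c

    # Discriminator step: push D(real) toward 1 and D(fake) toward 0.
    d_real = sigmoid(w * x_real + b)
    d_fake = sigmoid(w * x_fake + b)
    grad_w = np.mean(-(1 - d_real) * x_real + d_fake * x_fake)
    grad_b = np.mean(-(1 - d_real) + d_fake)
    w, b = w - lr * grad_w, b - lr * grad_b

    # Generator step: push D(fake) toward 1, i.e. fool the discriminator.
    d_fake = sigmoid(w * x_fake + b)
    grad_out = -(1 - d_fake) * w          # gradient of -log D(G(z)) w.r.t. G(z)
    a = a - lr * np.mean(grad_out * z)
    c = c - lr * np.mean(grad_out)

# After training, generated samples should resemble the real distribution.
samples = a * rng.normal(0.0, 1.0, 256) + c
print(f"generated mean ~ {samples.mean():.2f} (real mean is 4.0)")
```

Scale the two players up from two-parameter functions to deep networks trained on millions of photos, and you get faces that don't exist.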
- So, machine learning will mean that people can generate videos and images of anyone doing anything. What will be the effect of being able to generate realistic, compromising images of anyone?
- This will probably cause initial turmoil. Reputations will be vulnerable: anyone with some GAN skill will be able to create videos of another person doing reputationally damaging things.
- But knowing that anyone can generate any image might create a golden age for privacy. After an initial wave of fake news, people will start to doubt the truth of any media. This means that everyone will receive more privacy.
- People will doubt that a picture of someone doing something outrageous is truthful when they know that they could generate the same photo at home. Any real photos of people doing university drinking challenges in the street while wearing speedos will be obscured by the ability to make fakes.
- So, images will start to fail as a historical record of truth. As someone said online, it would be "an antidote to moral outrage if nobody could believe in any picture anymore".1
Indefinite life
- Everyone wants to live for as long as possible.2 For my part, I would be happy with a lifespan of maybe 6000 years, like Legolas.
- While human lifespan seems to be hitting a ceiling with current medicine, it seems inevitable that we will eventually be able to live indefinitely.
- While the human body is one of the most complex machines ever known, it is still just made of matter. So modifying any aspect of it - such as ageing - is an engineering problem (albeit a hard one).
Problems of not ageing
- However, solving ageing would lead to many other problems. What would happen in a society where each person could live for 300 years?
- One economic problem would be extreme concentration of wealth. Elon Musk's current wealth is around $230B. (He might be the first trillionaire, given that he is only in his 50s now.)
- If Musk lived 4 times as long, he might be individually wealthier than most countries. While Musk creates huge value, what would happen when someone passes down a trillion dollars to their beloved, but unwise, children?
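A back-of-envelope sketch of why long lives concentrate wealth: compounding. The 3% real return below is an assumed figure, not a prediction.

```python
# Hypothetical numbers: $230B compounding at an assumed 3% real
# (inflation-adjusted) annual return over an extended lifespan.
wealth = 230e9          # rough current net worth in dollars
rate = 0.03             # assumed real annual return

for years in (50, 100, 200, 300):
    future = wealth * (1 + rate) ** years
    print(f"after {years:3d} years: ${future / 1e12:,.1f} trillion")
```

Even at this modest rate, a 300-year lifespan turns $230B into more than a quadrillion dollars - far beyond the GDP of any country today.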
Outro
- I am enjoying Bonn and exploring the Rhineland. Here is a photo of our local car ferry across the Rhine.
- I hope that you are having a good week :)
1. https://news.ycombinator.com/item?id=28833213
2. …and in good health.