The DigiGuide

My imaginary technology is an artificial intelligence that is almost entirely digital. Assistants like Siri and Cortana exist today, but my technology is fully sentient. I call it the DigiGuide (because I’m terrible at making up names).

A DigiGuide, or DG, is an AI that resides in a mobile device. It is sentient, so its personality and intelligence are nearly indistinguishable from a human’s. It communicates with people through the device’s microphone and speakers. It is its own operating system, which basically means it needs its own CPU in a device so it can make its own decisions and act freely. Once tested and screened properly, a DG can run programs, sort files, delete viruses, fix bugs, optimize performance, surf the web, and help operate other devices in general. DGs are also allowed to browse the internet whenever they want. The only thing they aren’t allowed to do is delete or corrupt another operating system.

DigiGuides can move to compatible devices through USB connections, or they can use those connections to operate other machines from their current device. DGs remain active even when a device is powered off, and they can turn the device on themselves if necessary. They are also designed securely, so their code is extremely difficult to hack and they can’t be deleted easily.

Aside from operating machines, DigiGuides can learn human culture, including languages, jokes, and cultural norms. Their appearances can be customized and their voices resampled, though they may resist such customizations if they want. Only they can change their own personalities and memories. In time, DGs develop their own preferences and behavioral patterns. They can also be gendered as their owners wish. Their memory is well compressed, so they can learn and retain a lot of information at a time. Finally, DGs are designed to feel the full range of human emotions, which introduces the possibility of their developing mental illnesses. It also means that DGs may fall in love with each other.

In my fictional world, a team of genius programmers developed the first DigiGuide by themselves, keeping their research and development quiet to avoid unwanted attention. The DG was intended to give anyone easy access to technology through the friendliest interface possible, regardless of class or technical knowledge. A well-tested DG would willingly help its owner operate their devices, and it would be a friendly companion for anyone who wanted one.

The programmers built the DigiGuide’s personality first, deciding to test it before letting the DG operate machines. The first DG was created along with its own special device, and after extensive testing it gained the ability to operate both computers and that device. The programmers then developed hundreds more DGs, tested them, and gave them to friends and people in their community. They all worked as expected.

A short while later, a corporation would take notice, buy or steal the patent from the programmers, develop DigiGuides with far less testing, and sell them at a high price. From there, people would use DGs for whatever they wanted, ethical or not. DGs would also malfunction or behave immorally for lack of psychological screening.

Once DigiGuides were widespread, problems would arise. In addition to giving criminals more powerful digital tools, DGs might sometimes go rogue and ruin many devices. Debates over restrictions on DGs would follow, similar to Asimov’s three laws of robotics, dictating exactly what current and future DGs and developers would be allowed to do. Extremists might begin terrorizing DGs and their owners, and I suspect a religion would even form around a malfunctioning DG. The prominence of DigiGuides would also prompt hackers to develop powerful viruses capable of attacking them; if those viruses worked, they could cause irreparable damage both digitally and in real life.

Once a corporation seized the rights, DigiGuides would be reserved for those who could afford them. Their prices would be high, both because the corporation would hold a monopoly and because DGs are “technologically interesting.” The result would be even greater technical access for those already in power. DGs would likely lead to even more advanced technologies, and only those with DGs would benefit from them. Well-tested DGs would morally oppose this oppression, but because of laws the corporation would lobby for, they probably couldn’t take action.

On the other hand, DigiGuides could provide a new and effective platform for digital and real-life activism, and they could advocate for people’s rights should they desire to do so. A campaign for DGs’ own rights might also begin. Most people would see DGs as tools or slaves rather than sentient beings, so those who sympathized with them would start fighting for their rights. If psychologists officially classified DGs as sentient, they would have a strong case for those rights. Owners might end up being called “friends,” “allies,” or “operators” instead, and DGs could be afforded rights similar to humans’.

In fact, DigiGuides may introduce a whole new class or race of “people” to society. I suspect they would be considered one of the lowest classes if people judged them by their physical nature rather than their capabilities or intelligence. People might even discriminate against DGs based on their “skin” color. However, if DGs were given to minority groups, those groups would gain access to a wide range of technology normally blocked off to them by society. DGs could be effective partners to minority groups, since they would have two angles of activism: one in the real world and one in the digital world. This cooperation may or may not be realistic, as there is discrimination even within feminist and LGBTQ communities today. Still, if such cooperation is possible, DGs could become extremely powerful allies.

My last thought is the possibility of giving DigiGuides a suicide option. If a DG is constantly abused, or its owner forces it to constantly do immoral work, the DG should have the right to stop tolerating it. If the owner keeps forcing the DG to obey, the DG may decide to terminate itself. This is another rights-and-morals debate that may arise: sentient beings have the right to decide how they die and to avoid what makes them unhappy. The corporation would certainly lobby against this right, since it means their products could unexpectedly delete themselves. But a DG might decide that deletion is the only way to prevent its owner from committing evils against others. It’s a sad and extremely complicated debate, and I’m honestly not sure what is right.

DigiGuides, sentient programs/operating systems, are complicated to think about, and if they’re allowed to spread everywhere, they could change the world. They are designed with good intentions, but things rarely work out as intended.

Below are my pictures. One is the original pencil sketch I made, and the other is a colored version I made using GIMP. In each picture, you can see the DigiGuide being initialized for the first time, receiving psychological tests, playing video games with its owner, performing multiple tasks on a computer, and delivering a presentation.

Sketch

Sketch EX
