Over the past few decades, the capability of computers has grown immensely. In 1984, Apple released the Macintosh, giving consumers their first widespread look at the graphical personal computer. At the same time, Microsoft was steadily building up its own operating system, and Windows soon became one of the most popular operating systems in the world, running on everything from ATMs and industrial machines to personal computers. Today, computers control extraordinary things: driving cars, flying airplanes, and even simulating life itself. With all of this development, do we need to start rethinking what a computer should look like for the average consumer?
As machines gain computing power, having each person own a dedicated computer becomes inefficient. Because a single modern machine has the capacity to host multiple users, it would be more efficient, and often more convenient, to have one central computer per household, with workstations serving as access points to it. Although some may shy away from sharing a computer with others, a central computer can be partitioned with privacy in mind. This technology already exists in companies, schools, and government organizations. For instance, in Google's Californian data centers, users' data is stored and processed entirely on the server side; to the user, the device used to access the account serves only as an access point, or terminal. Such a device wouldn't need a powerful processor or large memory chips. All it needs is the ability to connect to the internet and deliver content quickly, so computing power equivalent to today's smartphones would suffice.
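The split described above can be sketched in a few lines of code. In this toy example, one class plays the central computer, which does all the real work, while a small function plays the terminal, which only asks for the answer and displays it. Everything here is illustrative; a real system would add user accounts, encryption, and a full remote-display protocol.

```python
# Minimal sketch of the terminal/central-computer split. All names and
# the "heavy work" are illustrative assumptions, not a real product.
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class CentralComputer(BaseHTTPRequestHandler):
    """Plays the household's central computer: it does all the actual
    processing and sends only the finished result to the terminal."""

    def do_GET(self):
        # Stand-in for heavy work the thin terminal never runs itself.
        result = sum(i * i for i in range(1_000_000))
        body = json.dumps({"result": result}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep the demo quiet

def terminal_request(port: int) -> int:
    """Plays the terminal: no local computation, it only asks the
    central computer for the answer and displays it."""
    with urllib.request.urlopen(f"http://127.0.0.1:{port}/") as resp:
        return json.load(resp)["result"]

if __name__ == "__main__":
    server = HTTPServer(("127.0.0.1", 0), CentralComputer)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    print(terminal_request(server.server_address[1]))
    server.shutdown()
```

Note that the terminal's code contains no computation at all, which is exactly why a smartphone-class device would be enough to run it.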
Numerous advancements in software and technology are beginning to make this possible. For instance, Google offers a program called Chrome Remote Desktop. This application enables users to connect to their PC from a smartphone, tablet, or other mobile platform, so work can be done on a mobile device without hauling around a computer.
However, if this technology is so promising, why don't people use it more often? There are two factors. First, current cellular infrastructure isn't capable of sustaining such vast amounts of data. Second, cellular data is expensive, and remote desktops demand enormous amounts of it. Even if a consumer uses Wi-Fi, the connection may sometimes be too slow to maintain a steady stream, and even high-speed home internet can struggle to sustain long remote sessions.
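A back-of-envelope calculation makes the data problem concrete. Assuming a compressed 1080p remote-desktop stream of about 10 Mbit/s and four hours of use per day (both figures are illustrative assumptions, not measurements), the monthly data volume is far beyond a typical cellular plan:

```python
# Back-of-envelope estimate of remote-desktop data usage. The 10 Mbit/s
# stream rate and 4 h/day usage are illustrative assumptions; real usage
# varies widely with resolution, codec, and screen activity.
STREAM_MBIT_PER_S = 10   # assumed compressed 1080p stream
HOURS_PER_DAY = 4        # assumed daily usage
DAYS_PER_MONTH = 30

def monthly_gigabytes(mbit_per_s=STREAM_MBIT_PER_S,
                      hours_per_day=HOURS_PER_DAY,
                      days=DAYS_PER_MONTH):
    """Convert a sustained bit rate into gigabytes transferred per month."""
    seconds = hours_per_day * 3600 * days
    bits = mbit_per_s * 1_000_000 * seconds
    return bits / 8 / 1e9   # bits -> bytes -> gigabytes

print(round(monthly_gigabytes()))  # prints 540
```

Roughly 540 GB a month under these assumptions, which explains why cost and infrastructure, rather than the software itself, are the bottleneck.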
There is, however, one significant drawback to this technology: energy. For the central computer holding the user's profile to be accessible at any time, it must stay powered on constantly, even when no one is using it. Another potential concern is the risk of having one's data intercepted in transit and information stolen. The obvious solution is to encrypt the data, but encryption is only as strong as its implementation, and weaknesses common at the consumer level, such as weak passwords or outdated protocols, can still be exploited.
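The energy cost of an always-on machine is easy to estimate. Assuming an idle draw of about 50 W and an electricity price of $0.15 per kWh (both figures are illustrative assumptions that vary by hardware and region), running the central computer around the clock looks like this:

```python
# Rough cost of keeping the central computer powered on 24/7. Both the
# 50 W idle draw and the $0.15/kWh price are illustrative assumptions.
IDLE_WATTS = 50
PRICE_PER_KWH = 0.15

def annual_cost(watts=IDLE_WATTS, price=PRICE_PER_KWH):
    """Electricity cost of running a machine continuously for one year."""
    kwh_per_year = watts * 24 * 365 / 1000   # watt-hours -> kWh
    return kwh_per_year * price

print(f"${annual_cost():.2f} per year")  # prints $65.70 per year
```

A few tens of dollars per household per year is modest, but multiplied across millions of always-on machines the aggregate energy use becomes a real drawback.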
As technology progresses, there are ever more innovative ways to use computers. In recent years, cellular data connections have gone from 3G to 4G and now 5G, allowing us to stream HD video over cellular networks. Faster networks will open a new channel for technologies like remote computing to develop, significantly helping consumers in their daily lives.