Where is computing headed? To the cloud and to mobile would be popular answers, and they wouldn’t be wrong. However, I think that’s such a narrow view that it could stop computing from reaching its full potential. On the mobile side, Android M and Windows are betting big on contextual understanding and machine learning. Apple has yet to announce iOS 9, but it’s rumoured to also have a predictive computing element. I am extremely excited for this technology to mature. Looking at the desktop platforms, Microsoft is making a big bet on Cortana and adaptability. WWDC ’15 hasn’t happened yet, so I can only speculate on what Apple will do, but I would bet pretty big on Siri coming to the desktop and a greater symbiosis between OS X and iOS being the big features.
The problem with both mobile and traditional computers is that they are stuck on some old concepts, and it’s holding back computing. It’s not that these ideas were bad, but in some cases they are outdated. We are also missing out on some huge advances in helping users do their everyday tasks. These advances are often relegated to third-party extensions and not given the resources or time to really become compelling products.
We have had such amazing advances, like cloud computing, that iterative innovation may be the wrong way to go about things. At some point, building our new advances on top of the old becomes the wrong thing to do, and I think we are at that point. I will explore how I think computing should change and evolve in a series I’m calling ‘AestOS’. This is but a small glimpse of what the next few months of posts will delve into.
Innovation is 1% inspiration and 99% perspiration.
This is the 1%.
Cloud computing allows your phone to send some data to a server and have it processed with some insane computing power. (This is what enables Siri, Cortana, Google Now and a whole lot more cool tech.) Being without an internet connection these days is rare, and we should be taking every advantage of this that we can. The problem is that we are building these advances on systems designed before the cloud was a real force. All computers have deep ties to the cloud by now, and they move further into the cloud with every major release, but I question the implementation. Cloud computing is doing some wondrous things, but it hasn’t been added to every aspect of our digital lives yet.
Cloud storage, which many confuse with cloud computing, is basically local storage that you can access anywhere. It is very useful, but it could be so much more.
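To make that distinction concrete, here is a toy sketch. Every class and function below is invented for illustration, not a real service API: cloud storage merely holds your bytes and hands them back, while cloud computing does real work on them server-side.

```python
class CloudStorage:
    """Cloud storage: local-style storage you can reach from anywhere."""
    def __init__(self):
        self._blobs = {}

    def put(self, path, data):
        self._blobs[path] = data  # the server just holds the bytes

    def get(self, path):
        return self._blobs[path]  # ...and hands them back unchanged


def cloud_process(data):
    """Cloud computing: the server transforms the data on your behalf
    (think of the speech recognition behind Siri, Cortana or Google Now)."""
    return data.upper()  # stand-in for 'insane computing power'


storage = CloudStorage()
storage.put("notes/todo.txt", "buy a yoga mat")
print(storage.get("notes/todo.txt"))   # comes back unchanged
print(cloud_process("hey computer"))   # comes back transformed
```

The storage half is passive; only the processing half adds intelligence, which is why the two shouldn’t be conflated.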
Machine learning, neural nets and hardware advancements on an accelerating path are set to make artificial intelligence truly possible and available to everyone. Companies are already putting a virtual assistant in all of our pockets. This is incredibly exciting, but to be truly useful it needs access to all of our data and needs to be available everywhere. Microsoft has put Cortana into all its devices and its Windows 10 browser ‘Edge’. (I wish they had gone with the ‘Spartan’ codename.) While some may argue that it’s too early to make a virtual assistant an integral part of your OS, I think OS makers need to start weaving assistants deeper into their OSes.
Artificial Intelligence goes so much deeper than a virtual assistant and we haven’t seen much of that yet.
The graphical user interface (GUI) and windowed apps were amazing advances over the command line, and enabled countless more people to gain the benefits of a computer. Developed at Xerox PARC and used by Windows, most distributions of Linux and OS X, this approach is practically unchanged from the early days. It is definitely not a bad idea; it was necessary for introducing non-technical people to a digital world. I just think we shouldn’t stop at good enough.
I also think we should look at what an app is and discuss a new way to add functionality to your computer without needing to download an entirely new app.
Data and Files
Most of our data is still stuck in the files we created it in. With the cloud, and advances in computing, apps should be allowed to pull data from other files to enrich their own experience. (With your permission, of course.) To accomplish this, I think we need to take another look at how we store data.
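One way to picture this is an OS-level data store that apps publish structured data into, with other apps reading it only after the user grants permission. A minimal sketch of the idea, with every class, app name and method hypothetical:

```python
class DataStore:
    """Hypothetical OS-level store: apps publish structured data, and
    other apps can read it only with an explicit user-approved grant."""

    def __init__(self):
        self._records = {}   # (owner_app, key) -> value
        self._grants = set() # (reader_app, owner_app) pairs the user approved

    def publish(self, app, key, value):
        self._records[(app, key)] = value

    def grant(self, reader, owner):
        """Called when the user approves the reader's access request."""
        self._grants.add((reader, owner))

    def read(self, reader, owner, key):
        if reader != owner and (reader, owner) not in self._grants:
            raise PermissionError(f"{reader} may not read {owner}'s data")
        return self._records[(owner, key)]


store = DataStore()
store.publish("Contacts", "alice.birthday", "1990-06-01")

store.grant("Calendar", "Contacts")  # the user says yes
print(store.read("Calendar", "Contacts", "alice.birthday"))
```

The point of the sketch is that data lives outside any single app’s file format, and the permission check sits in the OS rather than in each app.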
The Internet of Things
Having a number of proprietary apps to interface with the IoT may work right now because of how few devices we own, but it is not a good solution. Apple’s HomeKit is a good way to interface with IoT devices from different companies, but I’d argue we need to build this functionality deep into our OSes.
Types Of Computing
In the past, you sat at your desktop computer and used it; that was computing. Now we have mobile devices, smartwatches, virtual reality devices and more. Computing has moved into every aspect of our lives, and our OSes don’t reflect this yet. When I am in my room, and my computer is on, why can’t I just say out loud “I want to do some yoga” and have the following happen:
- My music shuts off
- My TV shows my yoga app
- The lights dim a little bit
- My heat is turned down a little
- My fitness tracker automatically connects with the yoga app to pull the workout I am doing
- My blinds close
- My devices are all put into “Do Not Disturb” mode
- My activity tracker automatically adds that I did some yoga
All with a single voice command. This is a mix of the IoT, voice recognition and data liquidity, but I like to think of it as a different form of computing. It’s not mobile computing; it’s natural computing. You live your life and the digital world conforms itself to you.
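The steps above could be wired together as a single “scene” that one recognised intent fans out across every relevant device. A rough sketch of that dispatch, where every device, action string and function is invented for illustration:

```python
class Home:
    """Stand-in for an OS-level hub that can reach every device."""
    def __init__(self):
        self.log = []

    def run(self, action):
        self.log.append(action)  # a real hub would call the device here


# One intent phrase maps to a whole scene of device actions.
SCENES = {
    "do some yoga": [
        "music: off",
        "tv: open yoga app",
        "lights: dim",
        "thermostat: lower",
        "fitness tracker: sync workout with yoga app",
        "blinds: close",
        "all devices: do not disturb",
        "activity tracker: log yoga session",
    ],
}


def handle_voice_command(home, utterance):
    """Match a recognised utterance to a scene and run every action."""
    for phrase, actions in SCENES.items():
        if phrase in utterance.lower():
            for action in actions:
                home.run(action)
            return True
    return False


home = Home()
handle_voice_command(home, "I want to do some yoga")
print("\n".join(home.log))
```

The interesting design question is where this table lives: today each vendor app owns its slice, whereas natural computing would put the scene and the dispatch in the OS itself.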
Big Data (For Your Benefit)
Google gives its users some amazing technological advances for the price of their data. While I don’t particularly like that business model, I do like the idea of using big data for the user’s benefit. This is an area we really need to explore. (Preferably by a company not interested in selling our data.)
This is a small glimpse of what the coming articles (and the cumulative book) will go into. It’s not dissimilar from the articles I have already published, and I will draw from them, but I hope to make a compelling case to re-examine what computing means and inspire change. I’m not looking to throw out ideas just because they are old, but neither am I sparing them just because they have become the de facto defaults.