Looking at the vast ocean that is modern-day computing, we can see that major developments come in waves. The arrival of mainframe computers in the 1960s generated the first wave (one computer for many people), followed in the late 1970s by personal computers in the second wave (one computer for one person). In 1988, Mark Weiser presciently observed that computers embedded into everyday objects, objects all around us, were forming the third wave—what he called ubiquitous computing (many computers for one person). A decade later, in 1999, Kevin Ashton put forth the ideas behind, and coined the term for, the fourth wave: the Internet of Things.
In this paradigm shift, Weiser's computer-embedded everyday objects—or "things"—are connected to the Internet and can communicate with users and with other devices. The guiding principle is connection, along with the conviction that if something can be connected, it will be connected. Indeed, in recent years, the wave appears to be rising to a crest. The plunging cost and size of processors and chipsets, the massive expansion of the IP address space, and the growing coverage of broadband networks allow virtually any object to be connected to the Internet. The computers, laptops, tablets, and smartphones that constitute the bulk of the Internet of Things (IoT) today are being joined by smartwatches, smart appliances, cars, lightbulbs, and an array of other devices that collect and transfer data, often without any human involvement. As data volumes grow and the technologies advance, we are moving from the early IoT of smart connections to a new phase, one of invisible integration.