The Internet of Things (IoT) is the interconnection of uniquely identifiable embedded computing devices within the existing Internet infrastructure. Typically, IoT is expected to offer advanced connectivity of devices, systems, and services that goes beyond machine-to-machine (M2M) communications and covers a variety of protocols, domains, and applications. The interconnection of these embedded devices (including smart objects) is expected to usher in automation in nearly all fields, while also enabling advanced applications such as the smart grid.
Things, in the IoT, can refer to a wide variety of devices such as heart monitoring implants, biochip transponders on farm animals, electric clams in coastal waters, automobiles with built-in sensors, or field operation devices that assist firefighters in search and rescue. Current market examples include smart thermostat systems and washer/dryers that use Wi-Fi for remote monitoring.
Besides opening up a plethora of new application areas for Internet-connected automation, the IoT is also expected to generate large volumes of high-velocity data aggregated from diverse locations, increasing the need to index, store, and process such data efficiently.
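The data-handling challenge above can be illustrated with a minimal sketch. The class and field names below are hypothetical, not part of any real IoT platform; the sketch assumes readings arrive as (location, value) pairs and keeps only a bounded window per location so memory stays constant as the stream grows.

```python
from collections import defaultdict, deque

class SensorAggregator:
    """Toy aggregator for high-velocity IoT readings, keyed by location.

    Retains only the most recent `window` readings per location, so
    memory stays bounded no matter how fast data streams in.
    """

    def __init__(self, window=1000):
        self.window = window
        self.readings = defaultdict(lambda: deque(maxlen=self.window))

    def ingest(self, location, value):
        # Index the reading under its source location as it arrives;
        # the deque silently evicts the oldest reading when full.
        self.readings[location].append(value)

    def average(self, location):
        window = self.readings[location]
        return sum(window) / len(window) if window else None

agg = SensorAggregator(window=3)
for temp in (20.0, 21.0, 22.0, 23.0):  # fourth reading evicts the first
    agg.ingest("warehouse-a", temp)
print(agg.average("warehouse-a"))      # -> 22.0 (mean of 21, 22, 23)
```

Real deployments would replace the in-memory deque with a time-series database, but the indexing idea, keying readings by source and bounding retention, is the same.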
As of 2014, the vision of the Internet of Things has evolved due to a convergence of multiple technologies, ranging from wireless communication to the Internet and from embedded systems to micro-electromechanical systems (MEMS). This means that the traditional fields of embedded systems, wireless sensor networks, control systems, and automation (including home and building automation), among others, all contribute to enabling the IoT.
The concept of a network of smart devices was discussed as early as 1982, when a modified Coke machine at Carnegie Mellon University became the first Internet-connected appliance, able to report its inventory and whether newly loaded drinks were cold. Mark Weiser’s seminal 1991 paper on ubiquitous computing, “The Computer of the 21st Century”, as well as academic venues such as UbiComp and PerCom, produced the contemporary vision of the IoT. In 1994, Reza Raji described the concept in IEEE Spectrum as “[moving] small packets of data to a large set of nodes, so as to integrate and automate everything from home appliances to entire factories”. However, the field only started gathering momentum in 1999. Bill Joy envisioned device-to-device (D2D) communication as part of his “Six Webs” framework, presented at the World Economic Forum at Davos in 1999, and Kevin Ashton proposed the term “Internet of Things” in the same year.
The concept of the Internet of Things first became popular in 1999, through the Auto-ID Center at MIT and related market-analysis publications. In the field’s early days, radio-frequency identification (RFID) was widely seen as a prerequisite for the Internet of Things: if all objects and people in daily life were equipped with identifiers, computers could manage and inventory them. Besides RFID, the tagging of things may be achieved through technologies such as near-field communication, barcodes, QR codes, and digital watermarking.
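The identifier-based inventory idea can be sketched in a few lines. The function names, tag format, and record fields below are illustrative assumptions, not a real RFID API; the point is only that once every object carries a unique machine-readable ID, inventory reduces to a lookup table keyed by that ID.

```python
# Hypothetical sketch: each scan event updates the last-known state of
# the object whose tag was read, whatever the tagging technology
# (RFID, NFC, barcode, QR code, ...).

inventory = {}

def register(tag_id, description, location):
    """Record an object the first time its tag is scanned."""
    inventory[tag_id] = {"description": description, "location": location}

def scan(tag_id, location):
    """A reader at `location` saw `tag_id`; update its position."""
    if tag_id in inventory:
        inventory[tag_id]["location"] = location
        return inventory[tag_id]
    return None  # unknown tag: not yet registered

register("E200-3412-01", "pallet of drinks", "loading dock")
scan("E200-3412-01", "aisle 7")
print(inventory["E200-3412-01"]["location"])  # -> aisle 7
```

Continuous scanning of such tags is what makes the “instant and ceaseless inventory control” envisioned for the IoT possible.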
In its original interpretation, one of the first consequences of implementing the Internet of Things, by equipping all objects in the world with minuscule identifying devices or machine-readable identifiers, would be to transform daily life. For instance, instant and ceaseless inventory control would become ubiquitous, and a person’s ability to interact with objects could be altered remotely based on immediate or present needs, in accordance with existing end-user agreements. Such capabilities also cut the other way: the same technology could grant motion-picture publishers far greater control over end-user devices by remotely enforcing copyright restrictions and digital rights management, so that a customer’s ability to watch a Blu-ray disc they had purchased would depend on the copyright holder’s decision, much like Circuit City’s failed DIVX format.