Mary Meeker’s Internet Trends 2014 Report, unveiled at re/code’s Code Conference last week, highlighted the “troves of findable and shareable data” from mobile devices and sensors as a crucial trend for marketers and, really, all businesses to address in the coming years. Data sensors in particular have become a major area of innovation for tech entrepreneurs looking to build inventions that deliver meaningful real-time analytics to companies and consumers. But how are companies creating new devices to collect that data? Meeker’s choice of “findable” and “shareable” is the key to unlocking the often lost or muddled value of data.
Samsung’s Simband, for example, was developed strictly as a sensory system. The tech giant wanted to create a basic platform (and probably to make it a standard) for collecting personal and environmental data so that other developers wouldn’t have to build their own device in order to realize their own data collection models. Apps are likely already in development for Simband, much as they were for Google Glass.
For those of you unclear on what a sensory system is, “A sensory system is a part of the nervous system responsible for processing sensory information.”
“Commonly recognized sensory systems are those for vision, hearing, somatic sensation (touch), taste and olfaction (smell). Receptive fields have been identified for the visual system, auditory system and somatosensory system, so far” (Science Daily). In tech terms, this means giving machines the same methods of collecting sensory details that we humans do.
Any third-party application built on Simband will need an operations model tailored to the device: one that makes sense of the information its sensors collect. By shipping a base device sans apps, Samsung has done what many other smart tech providers are looking to do: offer a system of sensors (usually wearable tech) that provides a wealth of opportunity for app developers without the device manufacturer having to get involved beyond the usual rights and royalties.
To extend the Google Glass reference, it’s another example of humans giving a machine sensors to take in details like images and sound, from which it can derive further data such as location, input command patterns and other analytics KPIs from the user. The apps and analytics models for Glass need to go beyond the collection methods and offer ways of repurposing the data into meaningful insight, behavioral patterns of users and more.
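To make that idea concrete, here’s a minimal sketch of what “repurposing data into behavioral insight” can look like: taking raw heart-rate readings from a wearable sensor and summarizing them into active and resting patterns. The readings and the 85 bpm cutoff are illustrative assumptions for this example, not figures from Simband, Glass or any real device.

```python
from statistics import mean

# Hypothetical minute-by-minute heart-rate readings (bpm) from a wearable sensor.
readings = [62, 64, 61, 88, 95, 102, 99, 70, 65, 63]

ACTIVE_THRESHOLD = 85  # assumed cutoff separating "active" from "resting" minutes

active = [r for r in readings if r >= ACTIVE_THRESHOLD]
resting = [r for r in readings if r < ACTIVE_THRESHOLD]

# Turn raw readings into a small behavioral summary an app could surface to the user.
summary = {
    "active_minutes": len(active),
    "resting_minutes": len(resting),
    "avg_active_bpm": round(mean(active), 1) if active else None,
    "avg_resting_bpm": round(mean(resting), 1) if resting else None,
}
print(summary)
```

The point isn’t the arithmetic; it’s that the value lives in the summary, not in the raw stream of numbers the sensor produces.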
Even data storage systems themselves will rely on their own sets of sensors in order to run better in the future. Ironically enough, data sensors will play an integral role in keeping the data centers that collect from other devices, algorithms and product models running smoothly. With sensors in place, issues like overheating servers, short circuits and electrostatic discharge could become a thing of the past – crucial when you consider the staggering amount of data being moved to virtual server centers and the cloud every day. According to Cisco Systems, global data center traffic will triple, reaching an annual volume of 7.7 zettabytes (7.7 billion terabytes) by 2017.
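As a rough illustration of how that kind of monitoring might work, the sketch below classifies temperature readings from rack-mounted sensors against warning and critical thresholds. The rack names, readings and cutoffs are all made up for the example, not taken from any real data center.

```python
# Hypothetical temperature readings (in degrees C) from rack-mounted sensors.
sensor_readings = {
    "rack-01": 24.5,
    "rack-02": 31.2,
    "rack-03": 22.8,
}

WARN_AT = 27.0      # assumed threshold for raising a warning
CRITICAL_AT = 35.0  # assumed threshold for an emergency response

def classify(temp):
    """Map a single temperature reading to an alert level."""
    if temp >= CRITICAL_AT:
        return "critical"
    if temp >= WARN_AT:
        return "warning"
    return "ok"

alerts = {rack: classify(temp) for rack, temp in sensor_readings.items()}
print(alerts)
```

In practice the interesting part is what happens next – paging an operator, throttling workloads, spinning up cooling – but it all starts with sensors making the facility’s own state findable.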
Data has become a big buzzword in emerging tech, but rarely do we use it to describe the overarching process of collecting data and making it “findable” or “shareable”, to use Meeker’s words. The latest developments in data sensors are just a small part of the big data equation; understanding what to do with the data once it’s collected is what truly provides value to companies and consumers.
Just as we like to think beyond the press release, we like to think beyond traditional definitions of data. We want to take our clients on a data journey that starts at collection and ends with rich insight that leads to better decision making. In an upcoming post, we’ll be exploring whether technology is improving or hindering our relationships with brands. Stay tuned for that and don’t hesitate to get in touch with us at email@example.com.