Are You Implementing Internet of Things with the Right Database?
The growing reliance on data to drive critical business decisions has led many organizations to rethink what they can achieve with the Internet of Things (IoT). If you think the Internet of Things is just a fad, you may be betting on the wrong side.
A research study of Machine-to-Machine technologies estimates that by 2020 about 12.5 billion devices will have sensors that let them be controlled, monitored, and automated. These devices include machines such as electrical meters, air conditioners, and lighting control panels – all of which will be used to manage smart buildings and improve their energy use and operational performance. Harnessing the data generated by these connected devices and gleaning insights from it leads to innovative business models, new revenue opportunities, and more compelling customer experiences.
The image below shows what a cutting-edge smart building can look like.
Notice the sensors and actuators embedded all around the building? They churn out large volumes of data on room temperature, humidity, and lighting to optimize energy consumption and avoid operational failures that have real-world consequences. In the retail industry, for example, a pharmacy refrigerator failing to maintain proper cooling temperatures could place high-value medical inventory at risk.
Having all of these devices connected will only help you if you have the right data model, and choosing one is the biggest challenge in getting the most out of your IoT infrastructure. The data model must accommodate high-velocity sensor data and several other considerations. Think of it this way: hundreds of sensors and actuators generate immutable time-series data – each reading is written once and never updated – but the accumulated volume is vast; think petabytes of information. To assimilate and analyze this information, database read/write performance is critical, particularly with high-velocity sensor data. Your database must support high-speed reads and writes, and it must be continuously available (100% of the time) to gather this data at uniform intervals. In addition, you must plan for data scalability so your horizontal data store remains cost-effective over time.
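A common way to model append-only sensor data like this is to partition each device's readings into fixed-size time buckets, so that every write lands in a bounded, evenly distributed partition. The sketch below illustrates the idea in plain Python; the sensor names and day-sized buckets are illustrative assumptions, not details from Riptide IO's actual schema.

```python
from datetime import datetime, timezone

def partition_key(sensor_id: str, ts: datetime) -> tuple:
    # Bucket readings by calendar day so a sensor's history is spread
    # across many bounded partitions instead of one ever-growing row.
    return (sensor_id, ts.date().isoformat())

# Two readings from the same sensor on the same day share a partition,
# so they can be read back together with a single range query.
r1 = partition_key("thermostat-17", datetime(2015, 6, 1, 9, 30, tzinfo=timezone.utc))
r2 = partition_key("thermostat-17", datetime(2015, 6, 1, 18, 5, tzinfo=timezone.utc))
print(r1 == r2)            # same bucket
print(r1)                  # ('thermostat-17', '2015-06-01')
```

Because readings are never updated in place, writes under this scheme are pure appends, which is exactly the access pattern a high-velocity sensor workload needs the database to sustain.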
When it comes to databases for sensor data, IoT solution provider Riptide IO knows the territory well. Riptide IO identifies opportunities to improve a building's operational efficiency and reduce energy use through interconnected sensors. The company knew from the start that relational databases simply wouldn't work because they couldn't deliver the required availability and scalability. After a careful evaluation, the distributed database Apache Cassandra™ stood out for its superior read/write performance and its ability to scale simply by adding nodes, which lets applications grow with the ever-increasing volume of sensor data. In the words of David Leimbrock, CTO of Riptide IO:
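The "scale by adding nodes" property rests on consistent hashing: each node owns a range of tokens on a ring, and a new node takes over only a slice of those ranges. The toy sketch below demonstrates that idea; it is a simplification for illustration and does not reproduce Cassandra's actual partitioner or virtual-node mechanics.

```python
import bisect
import hashlib

def token(value: str) -> int:
    # Hash a node name or data key onto a 0..2**32 token ring.
    return int(hashlib.md5(value.encode()).hexdigest(), 16) % 2**32

class Ring:
    def __init__(self, nodes):
        self.tokens = sorted((token(n), n) for n in nodes)

    def owner(self, key: str) -> str:
        # The first node clockwise from the key's token owns the key.
        positions = [t for t, _ in self.tokens]
        i = bisect.bisect(positions, token(key)) % len(self.tokens)
        return self.tokens[i][1]

three = Ring(["node-a", "node-b", "node-c"])
four = Ring(["node-a", "node-b", "node-c", "node-d"])

keys = [f"sensor-{i}" for i in range(1000)]
moved = sum(three.owner(k) != four.owner(k) for k in keys)
# Only the keys in the token ranges taken over by node-d change
# owners; every other key stays where it was.
print(f"{moved} of {len(keys)} keys moved after adding a node")
```

This is why growing the cluster is an incremental operation rather than a full reshuffle: most data stays put, and only the new node's share of the ring migrates.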
“We were really impressed with the fact that no matter how hard we tried, we couldn’t break Cassandra. We would remove machines from the cluster and blow things away, but it was just indestructible.”
As devices become increasingly automated and connected, the volume, variety, and velocity of data will continue to drive demand for a scalable, always-on database that can handle these workloads. With Cassandra, Riptide IO helped one of the largest retailers in the U.S. save millions of dollars by enabling smart buildings. Join us June 10 for a one-hour webinar to learn how Cassandra can help you achieve your IoT goals.