Internet of Things (IoT)

The Internet of Things, or IoT, refers to the growing number of devices connected to the internet – from mobile phones and thermostats to temperature sensors and GPS trackers. More specifically, it refers to the network these devices create and the abundance of data sent over it. The power of IoT lies in this data, and the combination of IoT and cloud computing presents an opportunity to transform a number of industries by making them smarter.

Smart Energy systems allow devices in a power network to be monitored, flagging excessive usage and enabling efficient distribution of load. Smart Cities use IoT devices to monitor traffic, pedestrian movement and environmental conditions to improve planning, assign resources and aid emergency services. Smart Manufacturing allows an entire plant to be monitored to improve throughput, identify bottlenecks and optimize the production line. DataInsighter specializes in providing a platform that puts all of this IoT data to work through collection, cleansing, analysis and access to the data and insights wherever they are needed.

Why IoT?

The traditional approach to monitoring and controlling a system is through proprietary devices. These devices are generally heavyweight and designed for predefined installations that may not match the requirements of the network. Their communication and data protocols are also proprietary and inflexible, making it difficult to get the most out of what is being measured or controlled and limiting the ability to change the network over time by adding devices or switching device vendors.

IoT solves these problems by providing both scalability and flexibility. Scalability comes from the fact that an IoT device simply needs to connect to the internet and transmit data: any device that can take a measurement and send it to an appropriate web-based backend can extend the network, as the sketch below illustrates. In the same way, a change in device manufacturer is invisible to the IoT platform as long as the data arrives through the appropriate web call, adding a further level of flexibility. The number of IoT devices in a system is therefore dynamic – there is no need to anticipate every future requirement; implement what is needed now and scale up or down as requirements change. From monitoring the power usage of a single home or business, all the way up to a city full of traffic and pedestrians, IoT provides the freedom to adapt over time.
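
As a minimal sketch of how little a device needs, the Python snippet below simulates a power meter posting readings to a hypothetical HTTPS ingestion endpoint using the widely available requests library; the URL, device ID and sensor read are illustrative stand-ins rather than part of any specific platform.

    import time
    import random  # stands in for a real sensor driver

    import requests  # third-party HTTP client

    # Hypothetical ingestion endpoint; any web-reachable backend would do.
    INGEST_URL = "https://example.com/api/telemetry"
    DEVICE_ID = "meter-0042"

    def read_power_usage_kw() -> float:
        """Placeholder for reading a real sensor; returns a simulated value."""
        return round(random.uniform(0.2, 3.5), 3)

    while True:
        payload = {
            "device_id": DEVICE_ID,
            "timestamp": time.time(),
            "power_kw": read_power_usage_kw(),
        }
        # One lightweight web call is all the device needs to join the network.
        requests.post(INGEST_URL, json=payload, timeout=5)
        time.sleep(60)  # report once a minute

Swapping the device vendor changes nothing above as long as the replacement sends the same call, which is exactly the flexibility described.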

IoT Analytics

The real power of an IoT solution lies in the data, and taking advantage of a full suite of data storage and analytics provides a unique opportunity for Business Intelligence as well as advanced analytics through machine learning and statistical modeling. Once an IoT system is instrumented and reporting its state, the entire network can be modeled and visualized. Broadly, there are two types of analytics to consider: real-time and latent (post-processing).

Live Monitoring and Alerting

Real-time analytics allow for constant updating of KPIs and system health measures for live monitoring and alerting. Cloud-based IoT platforms are designed to ingest millions of events per second, correlate across related data sources and push information to applications where it is needed, in near real-time. Individual devices do not need to be monitored in isolation; their data can be combined centrally to give a more informed overview of the entire network. Dashboards and applications can be built with smart, data-driven indicators that flag a problem at any point in the system.
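
To make the alerting idea concrete, the sketch below keeps a rolling window of recent readings per device and flags any device whose average crosses a threshold; the window size, threshold and event shape are illustrative assumptions, and a production platform would push the alert to a dashboard or paging system rather than print it.

    from collections import defaultdict, deque

    # Rolling window of the last N readings per device (illustrative values).
    WINDOW = 10
    ALERT_THRESHOLD_KW = 5.0

    recent = defaultdict(lambda: deque(maxlen=WINDOW))

    def handle_event(event: dict) -> None:
        """Process one incoming telemetry event and raise an alert if needed."""
        readings = recent[event["device_id"]]
        readings.append(event["power_kw"])
        rolling_avg = sum(readings) / len(readings)
        if rolling_avg > ALERT_THRESHOLD_KW:
            # A real platform would notify a dashboard or paging system here.
            print(f"ALERT: {event['device_id']} rolling average {rolling_avg:.2f} kW")

    # Example: feed a few events through the handler.
    for kw in (4.0, 5.5, 6.2):
        handle_event({"device_id": "meter-0042", "power_kw": kw})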

Post-Processing for Structure and Historical Analysis

Latent analytics, or post-processing, offer an opportunity to analyze data beyond its immediate descriptive power. As events are received in real-time, they are forwarded to the appropriate reporting and alerting mechanisms before being stored for later processing. Using big data technologies such as Hadoop, processing jobs are run over whatever data is required, scaling elastically to parallelize the work across any volume of data. These are a mixture of scheduled and ad hoc jobs, with scheduled jobs producing processed (cooked) data that can be stored in traditional data warehouses for faster reporting and analysis.
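
The following sketch shows, in plain Python, the kind of map-and-reduce aggregation a scheduled job might perform over stored raw events to produce cooked daily summaries; the event fields and sample records are illustrative, and at scale the same logic would run as a distributed Hadoop job.

    from collections import defaultdict
    from datetime import datetime, timezone

    # Raw events as stored after real-time handling (illustrative data).
    raw_events = [
        {"device_id": "meter-0042", "timestamp": 1700000000, "power_kw": 2.1},
        {"device_id": "meter-0042", "timestamp": 1700003600, "power_kw": 2.8},
        {"device_id": "meter-0107", "timestamp": 1700000500, "power_kw": 1.4},
    ]

    def day_of(ts: float) -> str:
        return datetime.fromtimestamp(ts, tz=timezone.utc).strftime("%Y-%m-%d")

    # "Map": key each event by (device, day); "Reduce": aggregate per key.
    totals = defaultdict(lambda: {"sum_kw": 0.0, "count": 0})
    for event in raw_events:
        key = (event["device_id"], day_of(event["timestamp"]))
        totals[key]["sum_kw"] += event["power_kw"]
        totals[key]["count"] += 1

    # The "cooked" rows a scheduled job would load into a warehouse.
    for (device, day), agg in totals.items():
        print(device, day, agg["sum_kw"] / agg["count"])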

Data-Driven Decisions

By providing a data-driven view of the entire process, all decisions can be made using data – no more ‘guessing’ required. This in itself is huge, but the benefits do not stop there.

PREDICTIVE MAINTENANCE

Using machine learning techniques such as predictive modeling, problems in the system can be detected before they happen, enabling predictive maintenance. Since each piece of equipment is fully instrumented, the early signs of an issue can be found in the data it produces. Models can be prototyped and refined offline before being deployed in the production system to report predictions and flag problems before they occur.
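
As an illustration of the modeling step, the sketch below trains a simple classifier on synthetic sensor features (say, vibration and temperature) to estimate failure risk, assuming the scikit-learn library; a real deployment would train on the instrumented equipment's historical data rather than generated values.

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    # Synthetic training data: two illustrative features per machine reading
    # (vibration level, temperature) and a label for whether a failure followed.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 2))
    y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=500) > 1.0).astype(int)

    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    model = LogisticRegression().fit(X_train, y_train)

    # Score new readings: a high probability is an early warning to schedule maintenance.
    failure_risk = model.predict_proba(X_test)[:, 1]
    print("flagged for maintenance:", int((failure_risk > 0.8).sum()), "of", len(X_test))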

OPTIMIZATION

Instrumenting the process allows the entire system to be optimized through data. Tweaks to individual components can be trialled and their impact fully measured across the whole deployment. The data output from any component can be fed back into the system, providing a rich feedback loop that goes beyond a traditional setup.
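
A toy example of such a feedback loop is sketched below: a simple proportional rule nudges a component toward a target based on its measured output. It is purely illustrative; real control logic would be designed around the specific process being optimized.

    def control_step(setpoint: float, measured: float, gain: float = 0.3) -> float:
        """One step of a simple proportional feedback rule (illustrative only)."""
        error = setpoint - measured
        return gain * error  # adjustment fed back into the component

    # Simulate a component whose output responds to the adjustments it receives.
    target, output = 10.0, 6.0
    for step in range(5):
        adjustment = control_step(target, output)
        output += adjustment + 0.1  # the 0.1 models an external disturbance
        print(f"step {step}: output={output:.2f}")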

COMBINING DATA SOURCES

Instrumentation and measurement need not be limited to the IoT network itself – this data can be integrated with other organizational data sources to boost efficiency through holistic insights. As an example, supply chain data in a Smart Manufacturing setup can form part of the control feedback loop, potentially throttling production capacity in reaction to supply shortage. Moving further along, Customer Relationship Management (CRM) data may also be fully integrated with Sales, Marketing and Social data. Sales teams can view manufacturing output data while on the road and report directly to customers. This full suite of data can be leveraged for improved resource management and planning.
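
As a small illustration of joining sources, the sketch below merges illustrative plant-output data from the IoT platform with supply-chain stock levels using pandas and applies a simple throttling rule; the column names and the four-hour rule are assumptions for the example only.

    import pandas as pd

    # Illustrative data: hourly plant output from the IoT platform and
    # supply-chain stock levels from a separate business system.
    output = pd.DataFrame({
        "line": ["A", "B"],
        "units_per_hour": [120, 95],
    })
    supply = pd.DataFrame({
        "line": ["A", "B"],
        "component_stock": [400, 2000],
    })

    combined = output.merge(supply, on="line")

    # Simple holistic rule: throttle any line that would exhaust stock within 4 hours.
    combined["throttle"] = combined["component_stock"] < combined["units_per_hour"] * 4
    print(combined)
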
Overall, the combination of IoT devices and cloud computing matches the capability of existing systems, while providing extra flexibility and lower costs. Beyond current capabilities, an IoT solution allows for a data-driven view of the entire system, implementation of new, powerful control algorithms and the ability to apply advanced data analysis and statistical modeling techniques on a centralized repository of data.

INTEGRATION SERVICES

Most organizations have existing business systems and associated data that were never designed to work together. However, this functionality – and, perhaps more importantly, this data – is far more useful and powerful when it can be managed and used collectively, wherever and whenever it is needed. Additionally, the prevalence of mobile devices and telecommuters has turned common access from a request into an expectation: company services and data should be available regardless of location.

Solving this problem is challenging from both an architectural and a development perspective. The best approach is to follow a set of established industry patterns, proven to work across a range of situations. At DataInsighter, we have experience with many of these patterns and can ensure that data (and functionality) is available where it is needed.

WEB APIS

Web APIs are a fairly recent concept, still lacking a formal definition, that allow an organization to expose its data and systems internally and externally using lightweight, broadly supported web calls. This approach keeps processing requirements on the client side minimal, meaning that mobile browsers and smartphone apps can easily take advantage of them. It also provides the ability to expose services in a common way to internal or external clients and to monitor and govern their usage through API management.
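
A minimal sketch of such an endpoint is shown below, using the Flask micro-framework to expose a device's latest state as a JSON web call; the route, data and framework choice are illustrative rather than a prescription.

    from flask import Flask, jsonify

    app = Flask(__name__)

    # Illustrative in-memory data standing in for an internal business system.
    DEVICE_READINGS = {
        "meter-0042": {"power_kw": 2.3, "status": "ok"},
    }

    @app.route("/api/devices/<device_id>", methods=["GET"])
    def get_device(device_id):
        """Expose a single device's latest state as a lightweight JSON web call."""
        reading = DEVICE_READINGS.get(device_id)
        if reading is None:
            return jsonify({"error": "unknown device"}), 404
        return jsonify({"device_id": device_id, **reading})

    if __name__ == "__main__":
        app.run(port=8080)

The same endpoint can serve an internal dashboard, a smartphone app or an external partner, with usage monitored and governed through an API management layer in front of it.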

CLOUD-BASED APIS

Web APIs also make it possible to expose systems via the cloud, moving access control off premises. This is a perfect use case for the cloud: API usage can scale effortlessly, and integration becomes simpler as other systems are also moved to the cloud. Spikes in API usage are handled smoothly, minimizing potential outages, while the rich tooling and support from cloud providers allow for rigorous API governance.

TRADITIONAL APPROACHES

Even with the prevalence of Web APIs, they are not always appropriate and can only be built on top of enterprise systems that are ready to support the pattern. In many cases, more traditional service-oriented architecture (SOA) integration is necessary, along with custom adapters that let new and existing systems communicate directly with each other. The latest techniques have many advantages, but the critical thing is to find the right approach, regardless of the particular technology used to achieve it.
Our engineers are familiar with all of the complexities an integration project involves and work to find the solution that strikes the right balance of custom work, extensibility, performance and security, making data and functionality available wherever it is needed.

TECHNOLOGY

DataInsighter engineers and data scientists employ a wide range of tools and technologies, adapting to the needs of each data challenge.

IN THE CLOUD

Cloud computing offers the ability to scale storage and computation based on need rather than being limited by an organization’s own computing capacity. Data can be acquired and stored indefinitely using services such as Amazon AWS (including EC2) as well as SQL Azure and Windows Azure. At DataInsighter, we make full use of the latest big data and NoSQL technologies, running Hadoop jobs with Amazon Elastic MapReduce or Microsoft HDInsight.
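
As one illustrative way of expressing such a job, the sketch below uses the mrjob library to define a Hadoop streaming job that averages power usage per device from CSV input; the input format is assumed, and the same script can be tested locally before being pointed at a managed cluster such as Elastic MapReduce.

    from mrjob.job import MRJob

    class AverageUsagePerDevice(MRJob):
        """Hadoop streaming job: average power usage per device, reading CSV
        lines of the assumed form 'device_id,timestamp,power_kw'."""

        def mapper(self, _, line):
            device_id, _timestamp, power_kw = line.split(",")
            yield device_id, float(power_kw)

        def reducer(self, device_id, readings):
            values = list(readings)
            yield device_id, sum(values) / len(values)

    if __name__ == "__main__":
        AverageUsagePerDevice.run()

The script runs unchanged against a local file for development or against a cluster by selecting the appropriate mrjob runner, which is what makes this style of job easy to scale out.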

ON PREMISES

We understand that not all data is suitable for deployment to a public cloud service. Existing infrastructures may not be easy to migrate, and some data may have security or privacy requirements excluding it from a public service. In these instances we employ the Microsoft BI stack for on-premises data management and analysis, including SQL Server Analysis Services and Reporting Services, with integration into Microsoft SharePoint.

MACHINE LEARNING AND RAPID PROTOTYPING

Data at scale is inherently difficult to analyze. To get the most out of it, we use state-of-the-art statistical and scientific computing packages such as R, Python and Matlab. We sample huge collections of data to obtain a representative set that is much smaller and easier to handle, then rapidly prototype different analytical approaches, including supervised and unsupervised machine learning techniques. This fast, iterative process lets us quickly identify effective models and then plan a scale-out of the most effective Big Data solution.
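
The sketch below illustrates this workflow on synthetic data: draw a small random sample from a large collection, then compare candidate models with cross-validation using scikit-learn; the data, sample size and models are placeholders for the real exploration.

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    # Stand-in for a huge collection: generate synthetic records, then take
    # a small random sample to keep prototyping fast.
    rng = np.random.default_rng(1)
    X_full = rng.normal(size=(100_000, 5))
    y_full = (X_full[:, 0] - X_full[:, 3] > 0).astype(int)

    sample_idx = rng.choice(len(X_full), size=2_000, replace=False)
    X, y = X_full[sample_idx], y_full[sample_idx]

    # Rapidly compare candidate models on the sample before any scale-out.
    for name, model in [("logistic", LogisticRegression()),
                        ("forest", RandomForestClassifier(n_estimators=50))]:
        scores = cross_val_score(model, X, y, cv=5)
        print(f"{name}: mean accuracy {scores.mean():.3f}")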

VISUALIZATIONS

Often the biggest impact data can have is through visualizations. After the data has been stored, cleansed and analyzed, we distill the most important parts into Microsoft Excel spreadsheets, creating appropriate tables, charts and graphics to make the insights obvious and useful. We have experience sharing and hosting these through Microsoft SharePoint and PowerPoint, and embedding graphics into web pages.

CUSTOM SOFTWARE AND SYSTEMS

With our years of combined experience in software development, we can identify pieces of the system that may be automated or scaled out using custom .NET, Java or C++ code. Whether it is a sensor producing isolated data that should be consolidated, or a robust web service to host the data and insights produced during analysis, we can build a tailored solution that adds to the overall value of the system.