Monetisation and Data Priority

Digital and communications service providers have long been in the “human-to-human” business. In the old days, their offerings centred on circuit-switched voice and messaging. In recent years, however, the operator earning model has shifted clearly toward data centricity.

On the consumer front, smartphones and app-driven consumption have set the pace of this transition, but there is much more to come: the ever-increasing number of connected devices and sensors throughout the world, and the data volumes they produce, are putting heavy pressure on operators’ networks and bandwidth. Consequently, there is a very credible risk that network connectivity and bandwidth will become a serious bottleneck for M2M communications.

To manage the scarce resources of the data pipeline in the era of the IoT, policy and charging control must step in to help maintain a high quality of service (QoS). This will be especially important when time-critical use cases such as video streaming and voice calls compete for space on the same data highways as the bulky, sometimes slow-moving and vastly voluminous sensor data of the IoT.
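
As a minimal illustration of the prioritisation idea (not any real operator’s policy and charging system, which would involve far richer rule sets), the sketch below simply dequeues latency-sensitive traffic ahead of bulk sensor uploads. The traffic classes and priority values are assumptions invented for this example.

```python
import heapq
import itertools

# Hypothetical priority levels; lower number = served first.
PRIORITY = {"voice": 0, "video": 1, "sensor_bulk": 2}

class PriorityScheduler:
    """Dequeues latency-sensitive traffic before bulk IoT data."""

    def __init__(self):
        self._queue = []
        self._counter = itertools.count()  # tie-breaker keeps FIFO order within a class

    def enqueue(self, traffic_class, payload):
        heapq.heappush(self._queue, (PRIORITY[traffic_class], next(self._counter), payload))

    def dequeue(self):
        _, _, payload = heapq.heappop(self._queue)
        return payload

scheduler = PriorityScheduler()
scheduler.enqueue("sensor_bulk", "temperature batch #1")
scheduler.enqueue("voice", "call frame")
scheduler.enqueue("video", "stream chunk")
print(scheduler.dequeue())  # "call frame" goes first despite arriving later
```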

The new monetisation model for operators is built on the principle of priority: users pay for more bandwidth. High data bandwidth thus becomes a source of income much like an airline’s business class, relying on users willing to pay for a higher QoS.
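
To make the “business class” analogy concrete, here is a minimal sketch of tiered bandwidth pricing; the tier names, bandwidths and prices are invented purely for illustration and do not reflect any real tariff.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class BandwidthTier:
    name: str
    mbps: int             # guaranteed bandwidth for the tier
    monthly_price: float  # illustrative price, not a real tariff

# Hypothetical tiers: higher QoS commands a higher price,
# much like airline cabin classes.
TIERS = [
    BandwidthTier("economy", 10, 9.99),
    BandwidthTier("business", 100, 49.99),
    BandwidthTier("first", 1000, 199.99),
]

def cheapest_tier_for(required_mbps: int) -> BandwidthTier:
    """Pick the least expensive tier that satisfies the bandwidth need."""
    eligible = [t for t in TIERS if t.mbps >= required_mbps]
    return min(eligible, key=lambda t: t.monthly_price)

print(cheapest_tier_for(50).name)  # -> "business"
```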

With data monetisation, all potential business models are likely to exist in parallel, with “more impatient” data carrying a higher price tag than bulk carriage data, just as in logistics and “snail mail” systems today. Getting a 20-foot container full of goods from China to Europe can be fast but costly (think air cargo) or slower but less expensive (think container ship); regular transfers can be built on subscription models, whereas ad hoc transfers can run on a pay-as-you-go basis. As these old fundamentals are so well established and understood, data transmission is likely to follow similar logic, at least in the beginning.

One thing is clear: given the uncertainties and the rapid evolution of business models and needs, the ability to change monetisation logic rapidly is of the essence.

At the Edge of the Network: Refining IoT Data

Computing needs to be fluid between the data source, the client and the cloud to optimise data flow through the scarce resource of connectivity. As data volumes surge within the IoT, turning information into actions should happen closer to the source, where intelligent mediation software has direct access to the raw data and full control over data flows.
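
As one sketch of what acting closer to the source could mean in practice, the edge filter below handles raw readings locally and forwards only significant changes upstream, so the uplink carries far less data. The threshold and example values are assumptions for illustration.

```python
def edge_filter(readings, threshold=0.5):
    """Forward a reading upstream only when it deviates enough
    from the last forwarded value; the rest is handled locally."""
    last_sent = None
    for value in readings:
        if last_sent is None or abs(value - last_sent) >= threshold:
            last_sent = value
            yield value  # only meaningful changes consume uplink bandwidth

raw = [20.0, 20.1, 20.2, 21.0, 21.1, 25.0]
print(list(edge_filter(raw)))  # -> [20.0, 21.0, 25.0]
```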

The IoT therefore necessitates an all-encompassing mediation layer that adapts rapidly to varying data sources, with complex event processing (CEP) for both structured and unstructured data. To extract true value and actions, the mediation layer needs to go beyond data integration: not just streaming the data, but also embedding real-time intelligence with analysis, refinement, reporting and monitoring.
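
To give a minimal flavour of CEP (a toy example, not a production engine), the sketch below detects a pattern across events rather than inspecting them one by one: several threshold breaches within a short time window produce a derived “overheat” event. The limits, window and event shape are assumptions.

```python
from collections import deque

def detect_overheat(events, limit=30.0, count=3, window_s=60):
    """Emit a derived event when `count` readings above `limit`
    fall within `window_s` seconds: a toy CEP pattern."""
    breaches = deque()
    for ts, value in events:  # events: (timestamp_seconds, reading)
        if value > limit:
            breaches.append(ts)
            # Drop breaches that slid out of the time window.
            while breaches and ts - breaches[0] > window_s:
                breaches.popleft()
            if len(breaches) >= count:
                yield {"event": "overheat", "at": ts}
                breaches.clear()  # avoid duplicate alerts for the same burst

stream = [(0, 25.0), (10, 31.0), (20, 32.0), (30, 33.0), (90, 29.0)]
print(list(detect_overheat(stream)))  # -> [{'event': 'overheat', 'at': 30}]
```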

In the past, all data was collected in one place and then processed and analysed in multiple separate phases. That was all well and good, but it introduced considerable latency and often moved data around unnecessarily. This is no longer a sustainable model as data volumes and sources grow and business cycles accelerate. Data refinement needs to become more fluid in terms of where data is processed, analysed and acted upon; the analysis and refinement of data into valuable actions needs to happen in real time, within the same uninterrupted data flow.
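
Contrast the old collect-then-process pattern with a single uninterrupted flow, sketched below: each reading is refined, analysed and acted upon as it arrives, with no intermediate store-and-forward phase. The stage functions and thresholds are placeholders invented for the example.

```python
def refine(stream):
    # Clean in flight: drop obviously invalid readings.
    return (r for r in stream if r is not None)

def analyse(stream):
    # Tag each reading in the same pass, with no intermediate storage.
    return ((r, "high" if r > 30.0 else "normal") for r in stream)

def act(stream):
    for reading, label in stream:
        if label == "high":
            print(f"trigger action for {reading}")  # placeholder action

# One uninterrupted flow: refine -> analyse -> act, record by record.
act(analyse(refine([28.5, None, 31.2, 29.9])))
```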

An IDC report predicts that, by 2018, 40 percent of IoT-created data will be stored, processed, analysed and acted upon close to, or at, the edge of the network. The systems that process this data must be capable of dynamically re-allocating tasks and activities along the network paths between the sensors and their compute capacity, the network and the cloud.
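
One way to picture that re-allocation is as a placement decision weighing a task’s latency budget against where it could run. The tiers, latencies and decision rule below are purely illustrative assumptions; a real system would measure conditions continuously.

```python
# Illustrative round-trip latencies (ms) for each compute tier.
TIER_LATENCY_MS = {"sensor": 1, "edge": 10, "cloud": 100}

def place_task(latency_budget_ms: int, needs_heavy_compute: bool) -> str:
    """Pick a tier that meets the latency budget, preferring the
    most capable eligible tier (cloud > edge > sensor)."""
    eligible = [t for t, lat in TIER_LATENCY_MS.items() if lat <= latency_budget_ms]
    if needs_heavy_compute and "cloud" in eligible:
        return "cloud"
    for tier in ("cloud", "edge", "sensor"):
        if tier in eligible:
            return tier
    raise ValueError("no tier can meet the latency budget")

print(place_task(5, needs_heavy_compute=False))    # -> "sensor"
print(place_task(50, needs_heavy_compute=False))   # -> "edge"
print(place_task(200, needs_heavy_compute=True))   # -> "cloud"
```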

This creates an interesting opportunity for digital and communications service providers, which tend to have data centres somewhere in the middle of the above model. If operators had IaaS and PaaS capabilities, they could give IoT system developers a means to optimise data flows in more innovative ways. Ultimately, this allows direct triggering nearer to the data sources for activities that require true real-time action, as well as the aggregation and movement of larger data sets at less congested times, off-loading the data pipe crunch to generate additional business benefits.
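
A sketch of that last idea: time-critical events trigger immediately near the source, while bulk data is held back for an assumed off-peak window. The hour boundaries and the time-critical classification are invented for the example.

```python
from datetime import datetime

# Assumed off-peak window for bulk transfers (invented for this sketch).
OFF_PEAK_START, OFF_PEAK_END = 1, 5  # 01:00-05:00 local time

deferred_batches = []

def handle(data, is_time_critical: bool, now: datetime):
    if is_time_critical:
        print(f"trigger immediately near the source: {data}")
    elif OFF_PEAK_START <= now.hour < OFF_PEAK_END:
        print(f"uploading deferred batch now: {data}")
    else:
        deferred_batches.append(data)  # hold until the pipe is less congested

handle("smoke alarm event", True, datetime(2016, 6, 1, 14, 0))
handle("daily meter readings", False, datetime(2016, 6, 1, 14, 0))
handle("daily meter readings", False, datetime(2016, 6, 1, 2, 0))
print(deferred_batches)  # the 14:00 batch waits for the off-peak window
```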

