Top IT Trends for 2018: Part 1

Author: Hu Yoshida, Chief Technology Officer, Hitachi Vantara

2017 was a watershed year for digital transformation. It wasn’t a year of major technology breakthroughs, but it was a year in which many of us began to change the way we use technology. Cloud adoption increased, and more applications were being developed for the cloud. Corporate executives grew more committed to digital transformation projects and invested in them accordingly; early indications are that the decline in productivity has stopped and an upturn is underway.

For my 2018 IT trend predictions, I’ve decided to focus more on the operational changes I believe will affect IT, rather than changes in technologies like flash drives. I will be posting my predictions under the following groupings:
o   Preparing IT for IoT
o   IT must do more than store and protect data
o   Get ready for new data types
o   Methodologies for IT Innovation

These are my own prognostications and should not be considered as representing Hitachi’s opinion.

Preparing IT for IoT

Prediction 1: IT will adopt IoT platforms to facilitate the application of IoT solutions.  

IoT solutions deliver valuable insight to support digital transformation and are rapidly becoming a strategic imperative in almost every industry and market sector. Cloud, analytics and IoT will greatly enhance OT-dominated industries and enable greater efficiency, security, intelligence and profitability for the enterprise. Unfortunately, most IT organizations have very little knowledge of or experience with OT systems like supervisory control and data acquisition (SCADA) systems, which were designed to provide high-level process supervisory management of peripheral devices, such as programmable logic controllers (PLCs) and discrete PID controllers, that interface with the process plant or machinery. Conversely, OT teams have very little knowledge of or practice with IT methodologies like cloud, containers, microservices, security, distributed storage and analytics.

Building IoT solutions that provide real value can be difficult without the right underlying architecture and a deep understanding of the OT business to properly simulate and digitalize operational entities and processes.

This is where the choice of an IoT platform and of an experienced services provider is important. Enterprises should look for an IoT platform that offers an open, flexible architecture that simplifies integration with complementary technologies and provides an extensible “foundry” on which to design, build, test and deploy the variety of industry applications companies need, quickly and with minimal hassle.

With IoT platforms, OT systems can now report state in near real time and use the scalability of cloud environments to implement more complex control algorithms than are practical on traditional programmable logic controllers. The use of open network protocols such as TLS, inherent in Internet of Things technology, provides a more manageable security boundary than previous OT systems offered. OT systems were designed to be open, robust and easy to operate and maintain, but not necessarily secure. Security becomes much more important as critical OT infrastructures come under attack.
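As a rough illustration of this pattern, the sketch below shows a gateway packaging device state as JSON and posting it to an IoT platform endpoint over a certificate-verified TLS connection. The endpoint URL, field names and device identifiers are illustrative assumptions, not a real Lumada or SCADA API.

```python
# Hypothetical sketch: a gateway reporting OT device state to an IoT
# platform over TLS. All names here are assumptions for illustration.
import json
import ssl
import time
import urllib.request

def build_state_payload(device_id, sensors):
    """Package the latest sensor readings as a JSON state report."""
    return json.dumps({
        "deviceId": device_id,
        "timestamp": time.time(),
        "state": sensors,
    }).encode("utf-8")

def report_state(endpoint, device_id, sensors):
    """POST the state report over a TLS connection that verifies the
    server's certificate, giving a clear, manageable security boundary."""
    context = ssl.create_default_context()  # enables certificate checks
    request = urllib.request.Request(
        endpoint,
        data=build_state_payload(device_id, sensors),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    return urllib.request.urlopen(request, context=context)
```

In practice the platform would define the schema and transport; the point is that standard, well-understood protocols replace the proprietary links of older OT systems.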

The decentralization of data also frees up data that is locked in traditional PLC memory addresses and makes it available for data modelling techniques like asset avatars: virtual representations of each device in software. These avatars can also include other pertinent information, such as web-based information and database entries, that may be used for other facets of the IoT implementation.
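A minimal sketch of the avatar idea, with class and field names that are my own assumptions rather than any Lumada API, might look like this: live readings decoded from PLC registers sit alongside contextual metadata in one addressable object.

```python
# Illustrative "asset avatar": a software representation of a physical
# device that merges live telemetry with contextual metadata.
from dataclasses import dataclass, field

@dataclass
class AssetAvatar:
    device_id: str
    device_type: str
    # Latest values decoded from PLC memory addresses
    telemetry: dict = field(default_factory=dict)
    # Other pertinent information: web-based docs, database entries, etc.
    metadata: dict = field(default_factory=dict)

    def update_telemetry(self, readings):
        """Merge a fresh batch of sensor readings into the avatar state."""
        self.telemetry.update(readings)

# Usage: model a pump whose raw register values now live in software
pump = AssetAvatar("pump-7", "centrifugal-pump",
                   metadata={"manual_url": "https://example.com/pump-7.pdf"})
pump.update_telemetry({"rpm": 1450, "inlet_pressure_kpa": 320})
```

Once device state lives in an object like this, it can be queried, enriched and analyzed like any other data, rather than staying trapped in controller memory.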

IT will be challenged to acquire the skills to build this platform if it has to be developed from scratch. Utilizing a purpose-built IoT platform like Hitachi’s Lumada will speed up time to value and free IT teams to focus on the final business outcomes. Depending on the complexity of the project, the platform might be implemented as an appliance or as a distributed platform spanning edge, gateway, core and cloud. Evaluate the available IoT platforms from experienced vendors before you commit the time and resources to build your own.

Prediction 2: Movement to the next level of virtualization with containers.

IoT applications are designed to run in a cloud-like environment for scalability and agility. Container-based virtualization is designed for the cloud and will gain wide acceptance in 2018.

Containerization is an OS-level virtualization method for deploying and running distributed applications on a single bare-metal or VM operating system host. It is the next generation up from virtual machines (VMs): where a traditional VM abstracts an entire machine, including the OS, a container consists only of the application and the dependencies it needs. This makes it very easy to develop and deploy applications. Monolithic applications can be rewritten as microservices and run in containers for greater agility, scale and reliability.

Everything at Google, from Gmail to YouTube, runs in containers; each week Google spins up over two billion of them. Almost every public cloud platform uses containers. The level of agility and scalability we see in the public cloud will be required in all enterprise applications if we hope to meet the challenges of exploding data and information in the age of IoT.

Hitachi has adopted containers in all its new software platforms and is rapidly converting key legacy platforms to containers, not only to realize the benefits of containers in our own operations but to facilitate their use by our customers. Our Lumada IoT platform is built on containers and microservices to ensure that it can scale and remain open to new developments. We also provide a VSP/SVOS plugin to provision persistent VSP storage to containers.

Prediction 3: Analytics and Artificial Intelligence

One of the primary objectives of IoT platforms is to gather data for analysis, which can then be augmented and automated through AI. In 2018, we will see more investment in analytics and AI across the board as companies see real returns on their investments.

According to IDC, revenue growth from information-based products will double that of the rest of the product/services portfolio for a third of Fortune 500 companies by the end of 2017. Data monetization will become a major source of revenue as the world creates 163 zettabytes of data in 2025, up from 10 zettabytes in 2015. IDC also forecasts that more than a quarter of that data will be real-time in nature, with IoT data making up more than 95 percent of it.

Preparing a wide range of data for analytics is a struggle that affects both business analysts and the IT teams that support them. Studies show that data scientists spend 20% of their time collecting data sets and 60% cleansing and organizing them, leaving only 20% for the actual analysis.

Organizations are increasingly focused on incorporating self-service data preparation tools in conjunction with data integration, business analytics and data science capabilities. The old adage “GIGO” (garbage in, garbage out) applies to analytics. The work starts with data engineering (cleansing, shaping, transforming and ingesting data) and data preparation (refining, blending and enriching it) before analytics can build, score, model, visualize and analyze.
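The cleansing and shaping step above can be sketched in a few lines. This is a deliberately tiny, standard-library-only example with made-up field names, not Pentaho code: it drops incomplete records, normalizes identifiers and coerces types so that garbage never reaches the analytics stage.

```python
# Toy data-engineering step: cleanse and shape raw sensor records before
# analytics. Field names and values are illustrative assumptions.
def cleanse(records):
    """Drop incomplete rows, normalize identifiers, and coerce types."""
    clean = []
    for row in records:
        # GIGO guard: discard rows missing a device ID or a reading
        if not row.get("device_id") or row.get("temp_c") in (None, ""):
            continue
        clean.append({
            "device_id": str(row["device_id"]).strip().lower(),
            "temp_c": round(float(row["temp_c"]), 1),
        })
    return clean

raw = [
    {"device_id": " Pump-7 ", "temp_c": "71.25"},
    {"device_id": "pump-8", "temp_c": None},   # incomplete: dropped
    {"device_id": None, "temp_c": "66.0"},     # incomplete: dropped
]
print(cleanse(raw))  # one valid, normalized record survives
```

Real pipelines do the same work at scale with tools like Pentaho Data Integration; the value of such tools is automating exactly these repetitive steps that otherwise consume most of a data scientist's time.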

AI has become mainstream, with consumer products like Alexa and Siri. Hitachi believes that it is the collaboration of AI and humans that will bring real benefits to society. Through tools like Pentaho Data Integration, our aim is to democratize the data engineering and data science process and make machine intelligence (AI + ML) accessible to a wider variety of developers and engineers. Tooling like the Pentaho Data Science Pack, with its R and Python connectivity, is a step in that direction. Hitachi’s Lumada IoT platform enables scalable IoT machine learning with flexible input (Hadoop, Spark, NoSQL, Kafka streams), standardized connections that can automatically configure and manage resources, and flexible output (databases, dashboards, alert emails and text messages). In addition to Pentaho analytics, it is compatible with Python, R and Java for machine learning.

This is an area where IT departments will need to learn new languages and toolsets for data preparation, analytics and AI. Integrated tools, like Hitachi’s Pentaho, will greatly reduce the learning curve and the effort.

In my next post, I will look at how data requirements are expected to shift in 2018, and the tools needed to address the coming changes.
