Building a Strong Foundation for AI Starts and Ends With Data

Modern, smarter technologies are reshaping the world of business. Today’s organisations rely heavily on data-driven insights to help them make better decisions, faster.

Over the years, Artificial Intelligence (AI) has become a transformational technology, enabling more tasks and workloads to be automated and delivering results far faster for data-driven organisations, while leaving behind those that fail to keep pace. At a time when businesses struggle to keep up with the exponential growth of data, AI can also be used to automate the entire data management process.

“Trust”, however, is a critical success factor, especially as some organisations remain sceptical about fully leveraging AI. According to Rodney Regalado, Business Unit Executive, IBM Storage Systems, ASEAN, AI in today’s world revolves around three themes – automation, natural language processing and trust.

For automation and natural language processing to work, trust has to be established. Before businesses can truly rely on AI, they need to be able to trust the information it’s giving them. Rodney highlights the use of AI in facial recognition as one example where underlying factors such as trust, privacy and policy have to be addressed before the technology can be made useful.

“The more we trust AI, the more we can depend on it. The ultimate purpose of AI is to unlock the value of data. AI adoption is about getting the benefit of predicting future outcomes and automation for a smarter workforce. A lot of modelling has to be done to reimagine new business models. Ten years back, no one would have imagined that we would be having virtual meetings. Today, it’s the main tool. Just like AI, ideas take time to conceive. The reimagination of business policies for AI will only enhance business”, said Rodney.  

With AI, Quality Over Quantity Matters
It is not a question of whether businesses can afford AI. Rodney explains that any organisation looking to implement AI can start with small projects. Be it large enterprises or SMEs, AI will provide added value for organisations regardless of their size. There is always the option to begin in a narrow, focused form and build towards a bigger, broader picture.

Rodney elaborates on this idea by highlighting the fact that data underpins AI. Starting small means that businesses can work on data sets that are more manageable and can deliver maximum value with minimal upfront cost.

However, the ability to handle and manage ever-increasing volumes of data continues to be a huge challenge for many organisations. AI requires high-quality input to work on the right data sets and extract value. Without it, the ability to ingest, process and classify data effectively suffers.

At the same time, the quality of data plays an important role in determining the output. So whichever way you look at it, the effectiveness of AI depends heavily on using the right data sets. As data typically comes from multiple sources and in two forms, structured and unstructured, managing it becomes a critical factor. Data needs to be organised in an efficient manner.
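To make the data-quality point concrete, here is a minimal, illustrative sketch of the kind of check a team might run on a structured data set before feeding it into an AI pipeline. The field names and sample records are hypothetical, not from the article:

```python
# Illustrative only: a basic data-quality report (missing values and
# duplicate rows) for a list of record dicts. Field names are assumptions.
def quality_report(records, required_fields):
    """Count missing values per field and duplicate rows."""
    missing = {f: 0 for f in required_fields}
    seen, duplicates = set(), 0
    for rec in records:
        for f in required_fields:
            if not rec.get(f):          # empty or absent counts as missing
                missing[f] += 1
        key = tuple(sorted(rec.items()))
        if key in seen:
            duplicates += 1
        seen.add(key)
    return {"rows": len(records), "missing": missing, "duplicates": duplicates}

sample = [
    {"customer_id": "C1", "region": "ASEAN", "spend": "120"},
    {"customer_id": "C2", "region": "", "spend": "75"},       # missing region
    {"customer_id": "C1", "region": "ASEAN", "spend": "120"}, # duplicate row
]
report = quality_report(sample, ["customer_id", "region", "spend"])
print(report)
```

Running a report like this before training or analysis is one way to act on the "quality over quantity" principle the article describes.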

Besides that, there is still a huge shortage of skills when it comes to AI. It is a relatively young field with a limited pool of skilled practitioners. Hence, finding the right skills will be a challenge for many companies looking to excel in AI.

IBM’s AI and Data Management Prowess
Rodney is responsible for enabling businesses to harness the value of data and AI across hybrid multi-cloud environments. Based on his interactions with customers, he believes that first and foremost, it’s important for businesses to understand the AI ladder and how data plays the most important role in it.


To help businesses gain better control over their data and the much-needed “quality” for successful AI implementations, Rodney explains that having a solid foundation for storage is key. Without the right foundation, using AI for data analysis is no easy task. Rodney believes that IBM’s portfolio of storage and data management solutions may be the answer.

“The ability to handle masses of data requires a scalable platform. At the same time, businesses need to be able to filter a lot of their data. IBM solutions can help manage this. As we go through the AI pipeline, with its need to organise and learn from data, IBM has technologies to cater to this. In the context of storage, IBM can help build a foundation for AI in the organisation”, explained Rodney.

Rodney suggests the following IBM solutions for businesses to address the challenges they face with data storage and to analyse data efficiently using machine learning and deep learning technologies.

  • IBM Spectrum Storage Suite addresses many problems faced by modern businesses, especially scalability issues that come when handling massive amounts of data. It offers licensing on a flat, cost-per-TB basis, making pricing predictable as your business capacity grows.

  • IBM Cloud Object Storage is able to manage data efficiently and effectively across the multiple forms of unstructured data. With built-in high-speed file transfer capabilities, cross-region offerings and integrated services, IBM Cloud Object Storage can help you securely leverage and manage your data no matter how geographically dispersed it is.

  • IBM Spectrum Discover organises, tags, classifies and analyses data properly. Businesses will be able to easily identify, prepare and optimise file and object data for faster business and operational results. IBM Spectrum Discover with its powerful ingest, data mapping, visualisation and data activation is critical to creating a successful and optimised AI infrastructure.

  • IBM Elastic Storage Server (ESS) uses machine learning and deep learning technology to turn data into insight. ESS is a modern implementation of software-defined storage, combining IBM Spectrum Scale software with IBM POWER processor-based I/O-intensive servers and dual-ported storage enclosures.

  • IBM Spectrum Scale is the parallel file system at the heart of IBM ESS. IBM Spectrum Scale scales system throughput as it grows while still providing a single namespace. This eliminates data silos, simplifies storage management and delivers high performance. By consolidating storage requirements across your organisation onto IBM ESS, you can reduce inefficiency, lower acquisition costs and support demanding workloads.
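The cataloguing and classification described above, which tools such as IBM Spectrum Discover automate at scale, can be pictured with a small, hedged sketch. This is not IBM's API; the extension lists and file paths are purely illustrative:

```python
# Illustrative only: tagging files as structured or unstructured by
# extension, a simplified stand-in for automated metadata cataloguing.
# The extension sets below are assumptions for the example.
from pathlib import PurePath

STRUCTURED = {".csv", ".parquet", ".json", ".xml"}
UNSTRUCTURED = {".jpg", ".png", ".mp4", ".pdf", ".txt"}

def tag_file(path: str) -> str:
    """Return a coarse data-type tag based on the file extension."""
    ext = PurePath(path).suffix.lower()
    if ext in STRUCTURED:
        return "structured"
    if ext in UNSTRUCTURED:
        return "unstructured"
    return "unknown"

files = ["sales/q1.csv", "scans/site.jpg", "logs/app.bin"]
tags = {f: tag_file(f) for f in files}
print(tags)
```

A real metadata catalogue would also record source, owner and policy tags, but even this coarse split shows why organised metadata makes downstream AI work easier.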

With organisations facing a shortage of talent for so many of these critical skills, Rodney said that IBM is working with educational institutions to train people and develop more skills for dealing with data and AI. After all, AI will not replace human jobs; rather, it will transform the working experience by automating many tasks.

Getting the Most Out of ALL Your Data
On top of all this, with hybrid multi-cloud becoming the norm, Rodney added that organisations need to ensure they have the right cloud foundation. Companies should focus on democratising data and making it accessible to all. Only then, Rodney feels, can companies break down data silos and do away with stranded data.

“In a multi-cloud environment, data can be on-premises or in the cloud. How do you break barriers or data silos? In fact, as much as 80% of data remains inaccessible to organisations. Businesses need to simplify and virtualise their environment to address complexity and costs. If you operate in a hybrid multi-cloud environment, the ability to manage it efficiently will be a challenge”, commented Rodney.

To find out more about how IBM can help your business manage its data and storage efficiently and leverage AI to get the best insights from it, click here.
