Operating in a volatile market, stock exchanges in Asia Pacific (Apac) need an agile data management strategy to keep pace with changes. Shanmuga Sunthar, Apac director of architecture and chief evangelist at Denodo, tells DigitalEdge why this is so and how stock exchanges can succeed with data.
What common data management challenges do Apac stock exchanges struggle with?
Although stock exchanges don’t own the data they handle, they serve as data controllers and processors, bearing legal responsibilities. Any compromise in the quality and accuracy of data sets — whether through access rights, application programming interface maintenance, or cross-border data flows — could expose them to legal and reputational risk.
Ensuring governance and security is challenging due to the vast and diverse data involved, including personally identifiable information, trading data, and corporate information. To stay compliant and secure, stock exchanges must maintain a holistic view of their data within a governed environment despite the high costs and significant effort required. This is especially critical in the time-sensitive stock market.
However, many Apac stock exchanges still depend on sub-optimal data strategies and traditional data practices carried over from early waves of computerisation and digitisation. Continued use of legacy systems, multi-cloud strategies, and software-as-a-service applications adds to an already distributed data ecosystem, compounding the complications stock exchanges need to overcome to deliver complete, unified, governed and right-time data to their many innovative use cases and projects.
Ultimately, an unoptimised data management system impacts a stock exchange’s ability to innovate, whether for data-as-a-service offerings or data products. Spending too much time on data preparation creates latency, reducing the speed of execution and defeating the purpose of building products and services that require real-time data delivery.
How will those challenges affect the use of emerging tech like generative AI (GenAI) and high-performance computing (HPC)?
Both GenAI and HPC rely on high-quality, accurate data to function effectively. They require a unified view of all data sources to process and analyse information in real time. Without this, stock exchanges will limit the potential of these technologies, leading to inaccurate results that hinder innovation and integration.
Poor quality data leads to incorrect or inconsistent output. As the saying goes, “garbage in, garbage out.” This applies whether GenAI is training large language models or accessing real-time data through Retrieval Augmented Generation. Such issues can cause hallucinations, leading to end-user mistrust.
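The “garbage in, garbage out” point can be illustrated with a minimal retrieval step of a hypothetical RAG pipeline (the function names and sample documents below are invented for illustration; the language model itself is omitted): whatever context the retriever surfaces bounds the quality of any generated answer, so stale or inaccurate source data flows straight into the output.

```python
# Minimal, illustrative retrieval step of a RAG pipeline.
# The retrieved context is what the model answers from, so data
# quality in the sources directly caps answer quality.

def retrieve(query: str, documents: list[str]) -> str:
    """Return the document sharing the most words with the query."""
    q_words = set(query.lower().split())
    return max(documents, key=lambda d: len(q_words & set(d.lower().split())))

def build_prompt(query: str, documents: list[str]) -> str:
    """Assemble a grounded prompt from the best-matching document."""
    context = retrieve(query, documents)
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

# Hypothetical market-data snippets; if either figure were wrong or
# stale, the generated answer would confidently repeat the error.
docs = [
    "ACME Corp closing price on 2024-06-03 was 41.20",
    "ACME Corp closing price on 2024-06-04 was 39.85",
]
print(build_prompt("closing price on 2024-06-04", docs))
```

A real pipeline would use vector embeddings rather than word overlap, but the dependency is the same: the model cannot be more accurate than the data layer feeding it.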
Data governance and security are crucial for stock exchanges adopting GenAI, especially as new regulations — such as South Korea’s AI Act and Singapore’s proposed Model AI Governance Framework for Generative AI — emerge to address transparency, bias, intellectual property, privacy, and security concerns.
While managing these regulations can be challenging, inadequate measures risk breaches, legal ramifications, and a loss of trust, damaging the reputation of stock exchanges when implementing generative AI solutions.
What should stock exchanges consider when building a modern data management architecture?
Stock exchanges should prioritise building a modern data management architecture that offers both accuracy and agility, while still maintaining governance and security. Some key factors that they should consider:
- Having a unified data delivery layer
A logical “data-first” strategy that integrates data sources onto one access layer provides a holistic view of data. This helps prevent the latency issues and maintenance challenges that arise when data must pass through multiple systems.
- Decoupling data sources from end-users
This gives flexibility to data teams when adding or removing data sources, leaving stock exchanges free to modernise their network whenever necessary without bottlenecks in the process.
- Intuitive platform use
Low-to-no code platforms allow for fast onboarding and less reliance on IT and data teams when it comes to pulling data. Users can, therefore, run queries using natural language, democratising data access across the organisation.
- Analytics capabilities
Any data management infrastructure that stock exchanges implement must be equipped to handle the processing of large datasets for analytics.
- Application programming interface (API) creation
Straightforward, secure API creation allows stock exchanges to expose services and interfaces to external applications safely.
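The second consideration above — decoupling data sources from end-users — can be sketched as a thin logical access layer. Everything below is a hypothetical illustration, not a real Denodo API: consumers request data by logical name only, so the physical source behind that name can be swapped (say, from a legacy system to a cloud warehouse) without touching consumer code.

```python
# Illustrative sketch of a logical access layer that decouples data
# consumers from physical sources; all names here are hypothetical.

class DataAccessLayer:
    def __init__(self):
        # Maps a logical dataset name to a callable that fetches rows.
        self._sources = {}

    def register(self, name, fetch):
        """Bind (or re-bind) a logical name to a physical source."""
        self._sources[name] = fetch

    def query(self, name):
        """Consumers only know logical names, never physical locations."""
        return self._sources[name]()


dal = DataAccessLayer()

# Initially backed by a legacy on-premises system (stubbed here).
dal.register("trades", lambda: [{"symbol": "ACME", "qty": 100}])
print(dal.query("trades"))

# Later the source migrates to a cloud warehouse: re-register the name.
# Consumer code calling dal.query("trades") is unchanged.
dal.register("trades", lambda: [{"symbol": "ACME", "qty": 100, "venue": "cloud"}])
print(dal.query("trades"))
```

The design choice is the indirection itself: because consumers bind to logical names, infrastructure can be modernised behind the layer without coordinated changes on the consumer side.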
What are the benefits of logical data management over a centralised one?
Centralised architectures alone often cannot provide the data capabilities stock exchanges need to keep pace with dynamic business needs, privacy regulations and technology evolution. Stock exchanges have been trying to establish the best ecosystem for enterprise data delivery since the beginning of the digital era. Centralised data systems may have suited the requirements of that time, but the game has changed.
The reality is that data nowadays is too big and too distributed. It is frequently not feasible (or even desirable) for companies to migrate all of their data onto a single system. The costs and time needed to undertake such a task, when current data sources reside across multiple locations, are often not worth the potential benefits centralisation might bring. Additionally, the diverse requirements of modern data analysis make it unrealistic to rely on a centralised “one size fits all” approach for every organisation.
Meanwhile, logical data management systems allow organisations to manage disparate data sources via a single-layer fabric, decoupling data access from its actual physical location. Advantages of this include:
- Data access through semantic models, giving data consumers a retrieval method that is intuitive to their needs.
- Ability to enforce common data policies even over disparate data sources, at lower cost and effort than applying them to every source system.
- Seamless technology evolution and infrastructure changes, such as the transition to the cloud.
The Denodo Platform realises the vision of a Logical Data Fabric as a unified data delivery platform that bridges the gap between IT infrastructure and business applications. Through data virtualisation, Denodo enables queries across multiple data sources — whether traditional (such as data warehouses) or non-traditional (like APIs) — while presenting itself as a single logical data source to the consumer.
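The virtualisation idea — one logical query spanning a traditional store and a non-traditional source such as an API — can be sketched as follows. This is a conceptual illustration only, not Denodo’s implementation; the SQLite table, the stubbed API, and the `unified_view` function are all invented for the example.

```python
import sqlite3

# One "traditional" source: a relational store (SQLite in memory here).
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE trades (symbol TEXT, price REAL)")
db.execute("INSERT INTO trades VALUES ('ACME', 41.20)")

# One "non-traditional" source: an API response, stubbed as a dict.
def api_reference_data():
    return {"ACME": {"sector": "Industrials"}}

# The logical layer joins both sources and presents a single result
# set to the consumer, who never sees the underlying systems.
def unified_view():
    ref = api_reference_data()
    rows = db.execute("SELECT symbol, price FROM trades").fetchall()
    return [
        {"symbol": s, "price": p, "sector": ref.get(s, {}).get("sector")}
        for s, p in rows
    ]

print(unified_view())
```

To the consumer, `unified_view()` behaves like a single data source; the join across the database and the API happens inside the logical layer, which is the essence of the virtualisation approach described above.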
How has Denodo helped an Apac stock exchange build a modern data management architecture?
The Indonesian Stock Exchange (IDX) initially faced challenges in its data preparation and delivery. Having massive volumes of daily transactional data (up to 23 billion shares are traded on IDX every day, including stock and bond trading data) spread across distributed, standalone systems meant IDX struggled with integrating its diverse data structures, which included both internal and external sources in a complex network. This impacted IDX’s ability to create new revenue streams via additional data services and products.
With the Denodo Platform centralising the management of distributed data, IDX’s clients gained much faster access to market data. By integrating real-time messaging and streaming data with API sources, Denodo supports IDX’s new Market Data System initiative, improving data management and governance through a single logical layer, accelerating information delivery and shortening data preparation time.
IDX now delivers data-as-a-service through streaming, messaging, and APIs — with the latter having over 175 currently in use, some of which are utilised for new data services that help IDX generate incremental revenue streams. Through Denodo, IDX managed to optimise its data management systems in a way that brought in business growth and innovation.
What else is needed in an effective data management strategy?
Data is the backbone of stock exchanges. It is crucial that stock exchanges drive towards a holistic data management strategy that can deliver data at the speed of business, in the language of business.
A robust, modern data architecture is the enabler, but stock exchanges must not neglect other elements of an effective data management strategy, such as data literacy, governance, quality control and execution speed. Stock exchanges need to work closely with both internal and external stakeholders to fine-tune their data management strategy. Keeping open channels of communication, building strong partnerships and cultivating data expertise will all contribute to a stock exchange’s ability to process, make sense of and leverage key market data for its business goals.