The phenomenal uptake of artificial intelligence in financial markets was always a given. After all, what’s not to like about faster, cheaper insights gleaned from bigger-than-ever datasets? That was especially true with the emergence a couple of years ago of the next step - generative AI.
While gen AI may sometimes behave like an overconfident teenager, there is no argument that it replenishes a battle-weary capital markets sector with energy, and introduces new ways to regain ground lost to intense competition and regulatory pressure.
So, in a sector driven by technology and all of its latest innovations, it can be easy to overlook the people component. Data is king, but it is still people who make the difference.
Clients tell us that while they may be confident in their trading strategies and technology, they are focusing more than ever on robust training and ongoing change management, to make sure their people understand, and are on board with, the increasing complexity and significant transformations taking place.
We are also seeing an interesting shift in conversations about data quality, which are moving from the middle and back office right up to the C-suite. A few years ago we would have been dealing with the head of data procurement on pricing. Today, chief revenue officers and chief AI officers want to be in the room to set value creation and value capture imperatives, with a sharper focus on source data quality, as it sets the foundation for the source code.
Sharing knowledge and skills
One important trend has been our clients’ move to reinvent themselves in the face of competition from fintech startups as well as the challenges of new and changing regulations and constantly evolving technology.
We are seeing more interest in buying or renting off-the-shelf solutions that are not core to the business (such as customer relationship management (CRM) software). But when it comes to core capabilities, there is increasingly a preference to develop custom solutions collaboratively rather than use a ready-made product.
While there are benefits to buying infrastructure outright - such as rapid deployment and usually predictable costs - there remains much to like about building your own. It offers not only the chance of endless customisation, but also significant efficiencies from having control over system updates, security and maintenance, and the opportunity to create a competitive advantage.
Despite compelling arguments for both buying and building, the fact is infrastructure decisions are no longer confined to one or the other. We are seeing more hybrids that marry best-of-breed solutions from the market with clients’ own builds. A hybrid approach balances custom solutions with ready-to-use products, helping to manage costs and mitigate risks.
This approach can lead to deep-rooted partnerships that take the best of knowledge and skills from all parties to create data insights that allow clients to reimagine their offerings and deliver more hyper-personalised products. It means that differentiation is about connecting ideas, relationships and capabilities, not just the ability to outspend.
Real-time demand
The demand for market and transactional data has never been greater, and the need for real-time market data is increasing, driven partly by continued regulatory and compliance pressure.
Our clients are asking for readily available access to market and transactional data and audits, either through our data lakehouse or via a self-serve model, ingesting data into their own lakehouses to run regular reporting or answer regulator queries.
It is quite a change from just 12 months ago, when delayed information, such as T+1 or end-of-day (EOD), was enough. An example of this is sanctions data rules. Previously, sanctions breach monitoring was run as an EOD report, often post-trade. Now clients are being pushed towards near-real-time capabilities and alerts. Regulators, too, are helping to drive the transformation by requiring real-time data for compliance assessments.
Of course, gen AI and machine learning are also delivering innovation in data interrogation. One area of interest is trade surveillance, where we are able to ‘learn’ the trading behaviour of dealers. From a compliance perspective, for example, that means we can spot anomalies and trigger a signal.
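To make the idea concrete, here is a minimal sketch of how anomaly detection on dealer behaviour might work in principle. It uses a simple rolling z-score on trade sizes rather than any particular machine learning model, and all names and thresholds are illustrative assumptions, not a description of an actual surveillance system.

```python
from statistics import mean, stdev

def anomaly_signals(trade_sizes, window=20, threshold=3.0):
    """Flag trades whose size deviates sharply from a dealer's
    recent behaviour, using a rolling z-score over a trailing window.

    Returns the indices of trades that sit more than `threshold`
    standard deviations from the trailing-window mean."""
    signals = []
    for i in range(window, len(trade_sizes)):
        history = trade_sizes[i - window:i]
        mu, sigma = mean(history), stdev(history)
        # Skip flat histories (zero deviation) to avoid division by zero.
        if sigma > 0 and abs(trade_sizes[i] - mu) / sigma > threshold:
            signals.append(i)
    return signals

# A dealer trading steadily around 100 lots, then one outsized trade.
sizes = [100.0 + (i % 5) for i in range(30)] + [500.0]
print(anomaly_signals(sizes))  # the final trade (index 30) is flagged
```

A production surveillance system would learn far richer behavioural features (timing, instruments, counterparties) with ML models, but the shape is the same: establish a baseline per dealer, then trigger a signal when new activity deviates from it.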
Trade surveillance data is also used to generate alpha signals, providing long-dated, high-quality historical data and tick-by-tick information down to the microsecond. It allows portfolio managers to change trading strategies very quickly, drawing on a more extensive data set than was previously possible.
This chance to drive efficiency and improved decision-making is also playing out in a reengineering of business processes – saving costs as well as creating growth. For example, there are widespread moves to merge the functions of dealing desks and portfolio managers by leveraging gen AI. We’re also seeing the use of gen AI to spin off new revenue streams, tailoring data assets to individual clients.
Looking ahead, copilot systems are expected to be a source of new growth and innovation. Without exception, our clients aspire to create a copilot to better serve their own clients by tailoring every single product and service to an individual rather than a segment.
Copilot systems rely on extensive and high-quality data sets delivered by stable and reliable infrastructure - a capability that Iress is proud to promote.
Navigating the complexities of global markets may depend on data and the integration of technology, but human expertise and spirit will make it happen in the right way and truly push the limits.
This article was originally published in SIAA Monthly September 2024.