Clear Path Analysis recently held a webinar on a topic close to Vidrio's heart: the data management hurdles faced by allocators. This blog post shares key insights from the webinar and sheds light on the challenges allocators currently encounter in managing data effectively.
By: Nick Bourne, Commercial Director, EMEA
Institutional investors are expected to excel at portfolio monitoring and alpha generation, yet they frequently grapple with substantial data obstacles that consume valuable resources. Conversations between Vidrio and allocators across the industry reveal the data challenges that most often plague asset allocation efforts. The top concerns, in allocators' own words, include:
- "Data systems are in place for one asset class, but others lack this same intelligence."
- "Investment teams spend 90% of their time on manual data tasks like collecting, normalizing, and uploading data to our systems. Accuracy suffers."
- "We want to scale our portfolio investments significantly but can’t trust our data, so until that’s under control, we’re pushing any further investments out."
The webinar, titled "Utilizing Data Volumes: Driving Profitability, Efficiency, and Risk Management," featured insights from Gayathri Pandurangan, Senior Director, Head of Innovation Engineering, KPMG; Guillermo Donadini, former Chief Investment Officer, General Insurance, AIG; Sahem Gulati, Head of Strategy & Consulting, M&G; and Robert Cataldo, Chief Investment & Strategy Officer, UFG Insurance, and was moderated by Rob Weber, Global Head of Solution Sales, Rimes Technologies.
Throughout the webinar, the speakers delved into the significance of establishing a data-centric approach within institutional investing, highlighting the growing emphasis on quality data management and the comprehensive capture of diverse datasets. Vidrio offers its own perspective below on these crucial themes discussed during the Clear Path Analysis event.
At the outset of the webinar, the panel revisited the fundamental principle of "garbage in, garbage out" in data management. Allocators must prioritize the reliability of their data sources to effectively shape future allocation strategies. Both Robert and Sahem emphasized the critical role of data as the lifeblood of institutional investment organizations. With increasing regulatory pressure on allocators, there is a growing demand from front-office investment teams for enhanced access to robust data pipelines.
The panelists unanimously agreed on the importance of a cohesive strategy that seamlessly integrates data and technology to drive transformative change within the investment landscape. Robert highlighted the daunting nature of this transformation for many, as rapid technological advancements leave teams with a choice: innovate or risk falling behind. The ultimate objective is to elevate data into actionable insights with tangible impact.
Gayathri and Guillermo built on these points by clarifying that data can no longer be hidden in silos, as silos lead to variations in quality and governance. Furthermore, CIOs and investment teams need to think carefully about the data they need to meet strategic goals; without consistency across these teams, data transparency will suffer, leading to more noise and unreliable data sources. For Sahem, data efforts must be outcome-driven, as he has seen many firms waste time and money without moving the data improvement needle. Cultural buy-in for your data strategy and improvement plan is a must, and those plans should be quantified at inception as well as during the expansion of your portfolio.
The structure and dissemination of data became the next topic of discussion. Guillermo began by stating that a cost-benefit analysis is an important tool for investment teams to better understand data costs, feedback loops, and automation. Vidrio aligns closely with this view: we believe that understanding delivery frequency across public and private market assets helps improve data transparency and timeliness, which in turn relieves investment teams of data collection burdens and cumbersome normalization processes. On the automation front, many allocators remain cautiously optimistic about automation and its impact on strategic allocations. Vidrio takes a hybrid approach, blending years of human expertise with advanced automation drivers such as large language models (LLMs) and deep neural networks. Even with this approach, necessary guardrails must be in place to avoid AI hallucinations. You can read more about our methodology in our latest blog, where we share our findings from a recent trip speaking with allocators across the GCC.
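To make the hybrid idea concrete, below is a minimal, illustrative sketch of a human-in-the-loop guardrail. It is not Vidrio's actual implementation; the names, thresholds, and confidence scores are hypothetical. The idea is simply that a figure extracted by an LLM is auto-accepted only when it passes basic sanity checks against the prior reported value, and is otherwise routed to a human reviewer.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ExtractedFigure:
    """A single figure pulled from a fund document by an LLM extraction step (stubbed here)."""
    name: str                # e.g. "net_asset_value"
    value: float             # the value the model extracted
    model_confidence: float  # 0.0-1.0 confidence reported by the extraction step

def guardrail_check(fig: ExtractedFigure,
                    prior_value: Optional[float],
                    max_move: float = 0.25,
                    min_confidence: float = 0.90) -> str:
    """Return "auto-accept" or "human-review" for an LLM-extracted figure.

    Hypothetical rules: low model confidence, a missing prior value, or a
    period-over-period move larger than max_move all trigger human review.
    """
    if fig.model_confidence < min_confidence:
        return "human-review"
    if prior_value is None or prior_value == 0:
        return "human-review"
    change = abs(fig.value - prior_value) / abs(prior_value)
    return "auto-accept" if change <= max_move else "human-review"

# Example: a 40% jump in NAV gets flagged even though the model is confident.
fig = ExtractedFigure("net_asset_value", 140_000_000.0, 0.97)
print(guardrail_check(fig, prior_value=100_000_000.0))  # -> human-review
```

In practice the tolerances and review workflow would be tuned per data source and asset class; the point is that automated extraction and human expertise act as checks on one another.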
Panelists in this Clear Path Analysis webinar were then asked to break down the order of importance when considering unstructured data sets, and the conversation turned to the obstacles that stand in the way.
Both Robert and Sahem agreed that existing silos can quickly break down any data improvement plan for allocators. Gayathri added that, beyond eliminating silos, leadership must view data as a strategic asset and commit to the overall improvement journey. For Vidrio clients, that journey follows the flywheel below: an allocator is onboarded through our document and data collection process and moves through to data verification and consumption, which in our view removes data fragmentation and siloed areas from the investment landscape. You can take a deeper dive into Vidrio’s data management solution by clicking on the flywheel image below.
Final thoughts
In the modern landscape of monitoring diverse asset portfolios, it is essential that alternative data remains consistent, whether it comes from private or public market investments. Effective platform solutions should factor in variables like inflation, interest rates, geopolitical risks, fees, and regulatory influences through risk assessments and benchmarking, and these elements must integrate seamlessly with existing data streams. Guillermo highlighted the complexities of scaling portfolio data with technology, a sentiment echoed by Sahem, particularly concerning the opacity of private markets. Many allocators acknowledge that data collection is becoming as much an art as a science, requiring time and dedication beyond purely quantitative analysis.
Overall, the webinar made a strong case for prioritizing portfolio data as a strategic asset amid the innovation underway in institutional investing technology today. The team at Vidrio agrees: to make data more actionable and reliable, quality and transparency are the natural starting points in an allocator’s data journey toward better asset allocation balancing, private market clarity, and ESG insight.
In recent months, Vidrio Financial has produced several blogs, podcasts, and a webinar that touch on the importance of data for the clients we serve, including OCIOs, asset managers, endowments, foundations, family offices, sovereign wealth funds, and more.
Please check out some of our most recommended content from the list below. If you’re interested in learning more about Vidrio’s data management flywheel and how we partner with allocators to improve data management controls, be sure to schedule a demo with our business development teams using the link here.
Top Data Management Resources from Vidrio Financial for Allocators:
Improving Alpha: Mazen Jabban on the Dangers of Delegating Away Allocator Innovation
Webinar: Discovering the Influences of Data Anomalies on Investor Portfolios
Exploring Data Management: The Top Concerns for Institutional Investors