A heuristic is any approach to problem solving or self-discovery that employs a practical method that is not guaranteed to be optimal, perfect, or rational, but is nevertheless sufficient for reaching an immediate, short-term goal or approximation. For investment managers, heuristics are used to speed up valuation analysis and ultimately investment decisions.
When working with financial services clients, our approach is typically to look at a company’s enterprise value (EV) in relation to its income, e.g. EV/Sales. This gives a quantifiable measure of a company’s value relative to its sales whilst accounting for both the company’s equity and its debt. Other typical heuristics are EV/Gross Profit, EV/NOPAT (net operating profit after tax) and EV/Capital Employed, all of which can be taken as indicators of a company’s value at acquisition.
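To make the arithmetic concrete, the sketch below shows how these heuristics could be computed in Python from a handful of inputs; the field names and figures are purely illustrative, not a client dataset.

```python
from dataclasses import dataclass

@dataclass
class Financials:
    """Illustrative inputs; all figures and field names are hypothetical."""
    market_cap: float         # equity value
    total_debt: float
    cash: float
    sales: float
    gross_profit: float
    nopat: float              # net operating profit after tax
    capital_employed: float

    @property
    def enterprise_value(self) -> float:
        # EV = equity value + debt - cash
        return self.market_cap + self.total_debt - self.cash

def valuation_heuristics(f: Financials) -> dict:
    ev = f.enterprise_value
    return {
        "EV/Sales": ev / f.sales,
        "EV/Gross Profit": ev / f.gross_profit,
        "EV/NOPAT": ev / f.nopat,
        "EV/Capital Employed": ev / f.capital_employed,
    }

if __name__ == "__main__":
    example = Financials(
        market_cap=800.0, total_debt=300.0, cash=100.0,
        sales=500.0, gross_profit=200.0, nopat=60.0, capital_employed=400.0,
    )
    # e.g. EV/Sales = (800 + 300 - 100) / 500 = 2.0
    print(valuation_heuristics(example))
```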
Companies typically use Excel to perform these analyses, consolidating information from a range of data sources, both proprietary and third-party, to show trends in the heuristic over a set period. They often compare the heuristic for an individual company vs a benchmark (industry index, portfolio, etc.) or vs a selected group of comparator companies. However, Excel is yesterday’s solution for today’s problem, one that simply cannot keep up with large datasets and complex variables.
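For illustration only, the benchmark comparison such a spreadsheet performs can be sketched in a few lines of pandas; the column names and the choice of a median benchmark are hypothetical.

```python
import pandas as pd

def heuristic_vs_benchmark(df: pd.DataFrame, heuristic: str = "ev_sales") -> pd.DataFrame:
    """Given one row per company per period with a pre-computed heuristic
    (e.g. ev_sales), return each value alongside the benchmark median for
    that period. Column names are illustrative."""
    benchmark = (
        df.groupby("period")[heuristic]
          .median()
          .rename(f"benchmark_{heuristic}")
          .reset_index()
    )
    return df.merge(benchmark, on="period", how="left")

# Hypothetical usage:
# panel = pd.DataFrame({
#     "company": ["A", "B", "A", "B"],
#     "period": ["2023-Q4", "2023-Q4", "2024-Q1", "2024-Q1"],
#     "ev_sales": [2.1, 3.4, 2.3, 3.1],
# })
# print(heuristic_vs_benchmark(panel))
```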
Firstly, Excel is a very manual approach, requiring regular human intervention to refresh data, add new companies or variables, distribute insights and switch benchmarks. The larger the spreadsheet, the more prone it is to errors and inconsistencies, which are difficult to detect and whose origins are time-consuming to trace through several layers of nested functions.
Making updates to the spreadsheet is equally time-consuming and can lead to off-target effects, i.e. the wrong metrics being inadvertently affected. Circular dependencies are common in complex reports and difficult to unpick retrospectively. Version control can also be problematic: it is hard to know which version of a report was used and when, which matters when revisiting historical analysis, and if an update doesn’t go according to plan, rolling back to a previous working version can be awkward.
Excel reports will inevitably grow as more history is added, new data sources are integrated and new heuristics are calculated. Excel’s performance rapidly deteriorates as it reaches its limits, and companies often need to reduce the granularity of their reports, shifting from daily to weekly or even monthly data points. While this might make an analyst’s life easier, the quality of the investment decision will suffer, as the analysis becomes further removed from the day-to-day performance of any given business.
Of course, we understand Excel has dominated many investment decks for the best part of 30 years, and it can be daunting to move to a brand-new solution. But it’s not a failing to recognise that an automated solution can outpace a manually curated spreadsheet; it is an opportunity to improve business processes and free up investment managers to make key decisions, rather than be bogged down testing the accuracy of formulae.
QuantSpark’s approach can be split into four key components, underpinned throughout by a strict data-quality framework to ensure accurate insights from high-quality data, and by robust monitoring of every step in the analytical pipeline to flag and rectify issues as they emerge.
We carefully interrogate complex Excel reports, breaking heuristics down into their base constituents, mapping the flow of information through the reports and auditing all calculations. We then flag inconsistencies and errors and recommend improvements to calculation methodologies. Clients receive a full set of documentation, helping them better understand the logic that drives their key reports.
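As a rough illustration of what that audit involves, the sketch below uses openpyxl to pull every formula out of a workbook and flag rough complexity signals; the workbook path and the complexity measures are assumptions made for the example.

```python
import re
import openpyxl

def audit_formulas(path: str) -> list[dict]:
    """Walk every sheet and record each formula, its location, a rough
    nesting depth (count of opening parentheses) and whether it references
    other sheets."""
    wb = openpyxl.load_workbook(path, data_only=False)
    findings = []
    for ws in wb.worksheets:
        for row in ws.iter_rows():
            for cell in row:
                if cell.data_type == "f":          # formula cell
                    formula = str(cell.value)
                    findings.append({
                        "sheet": ws.title,
                        "cell": cell.coordinate,
                        "formula": formula,
                        "nesting_depth": formula.count("("),
                        "references_other_sheets": bool(re.search(r"!\w", formula)),
                    })
    return findings

# Hypothetical usage: list the ten most deeply nested formulas.
# for f in sorted(audit_formulas("ev_heuristics.xlsx"),
#                 key=lambda x: x["nesting_depth"], reverse=True)[:10]:
#     print(f)
```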
Depending on the scenario, we can use off-the-shelf ETL tools or custom Python extraction workflows to automatically pull data from any source. Pipelines can be scheduled flexibly, meaning data is always fresh and costs are kept within the desired threshold.
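A custom extraction step might look something like the sketch below; the endpoint, response shape and output location are hypothetical, and in practice the job would be scheduled by an orchestrator (e.g. Airflow or cron) rather than run by hand.

```python
import datetime as dt
from pathlib import Path

import pandas as pd
import requests

def extract_daily_financials(api_base: str, api_key: str, out_dir: Path) -> Path:
    """Pull one day's company financials from a (hypothetical) REST endpoint
    and land them as a dated Parquet file for downstream transformation."""
    today = dt.date.today().isoformat()
    resp = requests.get(
        f"{api_base}/financials",                 # hypothetical endpoint
        params={"date": today},
        headers={"Authorization": f"Bearer {api_key}"},
        timeout=30,
    )
    resp.raise_for_status()                       # fail loudly so monitoring can flag it
    df = pd.DataFrame(resp.json()["results"])     # hypothetical response shape
    out_path = out_dir / f"financials_{today}.parquet"
    df.to_parquet(out_path, index=False)
    return out_path
```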
We encode all Excel logic (using dbt), meaning data is automatically transformed whenever new data is received. All calculations are standardised to ensure consistency across all reporting. Logic is well documented and easily accessible, which gives complete transparency to all users.
We replicate and enhance existing reporting in cloud-hosted dashboards (Tableau, Power BI or Looker). Toggles enable quick and easy comparisons of companies against benchmarks, and filters let you dig into your data to better understand trends. As a result, sharing insights becomes extremely easy and access can be strictly controlled.