An Approach to Maximise Efficiency and Cost Savings
CIOs face growing technology complexity without a matching budget increase, and economic uncertainty is forcing businesses to optimise their data integration. foryouandyourcustomers offers a method to do so, starting with a foundational assessment followed by two implementation stages. The assessment focuses on five key pillars: the data integration licensing model, current capacity vs. utilisation, integration patterns, reusability, and risk management around API governance. Aligning the integration layer with budget constraints ensures future-proofing and success.
CIOs face a significant challenge in managing the increasing complexity and supporting the expanding technology landscape without a corresponding budget increase. A substantial portion of their budget is devoted to maintaining and supporting legacy applications.
As the global economic landscape remains uncertain and the threat of a recession looms on the horizon, businesses are increasingly under pressure to re-evaluate the value and return on investment of their digital and data initiatives. In these challenging times, the "do more with less" approach is becoming more prevalent. This requires a focus on optimising the data integration layer that serves as the backbone of innovative solutions and mission-critical processes.
All too often, digital transformation efforts aimed at achieving quick wins leave behind a legacy of technical debt and convoluted business processes that not only hinder the intended benefits but also negatively impact profitability.
Furthermore, certain technology providers may use data integration licensing models that charge based on the highest level of demand for infrastructure or capacity requirements, which may not be suitable for the specific needs of your organisation. This misalignment can stifle cost reductions and hinder the positive cost degression effects typically associated with scaling and expanding your business.
Now, more than ever, it is time for organisations to re-evaluate and optimise their data integration strategies to better plan, manage capacity and make the most of this crucial layer.
Addressing our clients' requirements, foryouandyourcustomers has devised a method aimed at enhancing efficiency and reducing costs associated with Data Integration. This approach kicks off with a foundational assessment, succeeded by two implementation stages.
The initial stage of implementation focuses on activities with direct cost implications. Here, the focus lies on the optimisation of the existing data integration framework, as well as the diversification of technology to draw from a wider array of solutions.
The subsequent stage extends to the execution of findings with long-term cost ramifications, prominently those tied to risk mitigation and security measures. Here we are focusing on tackling potential threats, vulnerabilities, and uncertainties spawned by the uncontrolled growth of inadequately managed APIs.
The Foundational Assessment – Analyse Your Current Integration Landscape, Identify Inefficiencies and Cost Reductions
The first step is to conduct a thorough assessment of the existing and future data integration processes and tools. The aim is to identify any inefficiencies, redundancies or areas where costs can be reduced, looking for opportunities to streamline workflows and eliminate unnecessary steps.
In a Data Integration landscape built on today's Data Integration technology vendors, there are five key pillars with a higher propensity for inefficiencies and cost reduction opportunities.
Five Key Pillars in the Foundation Assessment
The Assessment Report
The output of the foundational assessment is a comprehensive report that unveils the potential for significant cost reductions, supported by a clearly laid-out business case.
In parallel, the report outlines a technology diversification plan and suggests optimisation measures for the current data integration layer. These activities will be included in an implementation roadmap.
Lastly, the report prioritises risk management by providing an in-depth assessment of potential vulnerabilities in the system. It offers effective strategies to mitigate these risks, ensuring the security and integrity of your data.
In the following section, we will delve into the five pillars that can help to identify inefficiencies and opportunities for cost reduction.
Data Integration Licensing Model
The cost structure for data integration services can vary depending on the provider you choose. Depending on your business model and ongoing projects, one licensing model may be more advantageous than another.
For instance, in some situations, you may need to license compute cores to facilitate integration, whether it's done on-premises or through a cloud-based integration platform (iPaaS). Alternatively, certain providers may charge based on a subscription or consumption-based model, where costs are determined by the amount of data processed or the number of API calls made.
Regardless of the model chosen, it is crucial to reassess both your current and future business initiatives to determine which licensing approach aligns better with your present needs and potential growth.
Key areas to analyse during this assessment include:
Current licensing model in use.
Upcoming renewal considerations.
Comparative analysis of different licensing models.
By conducting this evaluation, you can make informed decisions about the most suitable data integration licensing model for your organisation, ensuring optimal cost-effectiveness and alignment with your evolving requirements.
Typical License Model of Data Integration Solutions
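As a rough illustration of the comparative analysis mentioned above, the trade-off between a core-based and a consumption-based model can be sketched in a few lines of Python. All prices and volumes below are hypothetical assumptions for the sake of the example, not vendor figures:

```python
def core_based_cost(cores: int, price_per_core_year: float) -> float:
    """Fixed annual cost: you pay for every licensed core,
    whether it is busy or idle."""
    return cores * price_per_core_year


def consumption_cost(api_calls: int, price_per_million: float) -> float:
    """Variable annual cost: you pay only for the work actually processed."""
    return api_calls / 1_000_000 * price_per_million


# Hypothetical figures for a renewal comparison
fixed = core_based_cost(cores=8, price_per_core_year=20_000)
variable = consumption_cost(api_calls=300_000_000, price_per_million=400)
print(f"core-based: {fixed:,.0f}  consumption-based: {variable:,.0f}")
```

With these assumed figures the consumption model would be cheaper; with a higher transaction volume the conclusion could flip, which is exactly why the analysis must be grounded in your own utilisation data.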
Current Capacity vs. Utilisation
Capacity vs. utilisation analysis is a comparison of the available capacity of your data integration platform, mostly defined by a licensing model, against its actual utilisation.
It is a crucial process for understanding how efficiently and effectively resources are being used to meet demand or workload requirements.
The analysis involves comparing the capacity and utilisation levels to assess the performance and efficiency of the integration layer.
Current Integration Capacity vs. Utilisation
Three possible scenarios can emerge from this analysis:
Underutilisation: This occurs when the utilisation level is significantly lower than the capacity. It suggests that the systems or resources are not fully utilised and may have excess capacity that is not being used efficiently.
Optimal Utilisation: In an ideal scenario, the utilisation generally matches the capacity, indicating that the system is operating at its maximum efficiency without being overloaded or causing performance issues. Achieving optimal utilisation is a desirable goal for resource management.
Overutilisation: Utilisation exceeding capacity is overutilisation. This indicates that the system is being pushed or will be pushed beyond its limits, potentially leading to performance degradation, slowdowns, or even failures. Overutilisation can be a sign that additional resources, capacity expansion or a different licensing model is necessary to meet the workload demands effectively.
If this analysis results in either "Underutilisation" or "Overutilisation", it gives a clear indication of areas where potential cost reduction options can be explored.
In the case of “Underutilisation", cost reduction opportunities may involve reducing licenses during renewals.
On the other hand, in scenarios of “Overutilisation", it becomes essential to optimise resource allocation to avoid inefficiencies, or to explore the benefits of license diversification.
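The three scenarios can be captured in a small classification helper. This is only a sketch: the thresholds (here 60% and 100% of licensed capacity) are assumptions that would need tuning to your own licensing terms and workload profile:

```python
def classify_utilisation(used: float, capacity: float,
                         low: float = 0.60, high: float = 1.00) -> str:
    """Compare actual usage against licensed capacity.

    `low` and `high` are illustrative thresholds, not vendor guidance.
    """
    ratio = used / capacity
    if ratio < low:
        return "Underutilisation"    # candidate for reducing licenses at renewal
    if ratio <= high:
        return "Optimal Utilisation"
    return "Overutilisation"         # candidate for re-licensing or diversification


print(classify_utilisation(used=3, capacity=10))   # Underutilisation
print(classify_utilisation(used=12, capacity=10))  # Overutilisation
```

In practice `used` would come from platform monitoring data aggregated over a representative period, not a single point-in-time reading.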
A real-world example
One of our clients encountered the following situation: Several scheduled end-of-day batch jobs were running on a platform with fixed dedicated computing capacity. Consequently, the paid capacity remained unused during the day. By migrating the batch process to a specialised platform with a consumption-based licensing model, we were able to free more computing power for other tasks and achieved optimal utilisation.
Example illustrating the inefficiency of running scheduled jobs on a capacity-based license model.
Integration Patterns in Place
The Data Integration Platform is an integral component of innovative solutions and mission-critical processes. When implementing digital changes with a focus on quick wins, it is important to consider the potential negative impacts on profitability and overall success. Rushing through the process can result in technical debt and disjointed business processes, ultimately hindering the original goal of achieving quick wins.
Examples of Architectural Patterns for Data Integration
This is why it is important to explore the current integration patterns that are in place and contrast them with industry best practices.
Depending on the actual business requirements, some examples of architecture patterns that can provide a better outcome at a lower cost include:
Integration complexity: Integrations vary in their level of complexity, but complex integration technologies make every integration complex. Unnecessarily complex integrations inevitably become costly to implement, maintain and evolve.
Picking the right tool for the job is crucial: the technology that matches the business integration requirements with maturity and simplicity will bring a higher ROI.
Master Data Hub: Allows data synchronisation, deduplication, enrichment, cleansing, validation, etc. This enhances the ability to centralise data to simplify business decisions.
Some mature Data Integration technology providers offer out-of-the-box solutions that dramatically reduce costs while enabling high data quality, compared with home-grown developments that are costly to implement and maintain.
API-led Connectivity: Provides a layer-based approach that separates system connectivity from business process and experience flows. This approach also allows a higher level of reusability, which leads to further cost reductions.
Modern Event-driven Architectures: Modern data streaming technologies simplify the streaming of events, enabling near real-time business decision-making at lower implementation and maintenance costs.
Large Data Movements: Batch processing remains one of the most common, yet worst-implemented architectural patterns. Most companies have sub-optimal and expensive batch-processing solutions in place that not only increase costs but also limit business agility.
Most Integration technology vendors will claim to support “batch processing”. However, the level of maturity, capability and sophistication varies widely among them, which leads to unnecessarily higher costs of implementation and maintenance.
Micro-services Architecture: When executed correctly, this architectural style divides applications into smaller, independent services that can be developed, deployed and scaled separately, increasing discoverability and reusability. Done poorly, however, it leads to expensive development cycles and duplicated effort, which increases both costs and security vulnerabilities.
The immediate outcome of this analysis is a set of candidate architecture patterns that meet the business requirements with:
Lower costs - By requiring less infrastructure.
Smaller investment - In ongoing development phases.
Lower operational costs - Through simpler maintenance effort.
Reusability and Consolidation
Reusability and consolidation play a crucial role in optimising business operations, fostering innovation and enabling cost-effective growth.
Unfortunately, APIs and integration flows in general are often poorly governed, with minimal discoverability, leading to constant duplication of effort. This results in unnecessary development and testing phases, and in turn higher maintenance and operational costs.
Reusability allows businesses to leverage existing resources, processes and solutions instead of creating new ones from scratch. This enables faster development and deployment of new projects, leading to lower costs and quicker time-to-market.
It is crucial to analyse the level of discoverability and reusability of APIs and integration flows. This is as simple as auditing a full inventory of APIs, which most companies will fail to provide.
Reusability and consolidation of APIs and integration components provide a clear pathway to:
Extensive cost reductions - Removing unnecessary development and testing phases.
Higher-quality solutions - Fewer errors and a clear reduction of defects.
Simpler maintenance and support - Fewer points of failure and easier troubleshooting.
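A first-pass duplication audit over an API inventory can be surprisingly simple. The sketch below assumes a hypothetical catalogue export in which each API carries a `name`, an owning `team` and the business `capability` it serves; grouping by capability surfaces candidates for consolidation:

```python
from collections import defaultdict


def find_duplicates(inventory: list[dict]) -> dict[str, list[dict]]:
    """Group APIs by normalised business capability; any capability
    served by more than one API is a consolidation candidate."""
    by_capability = defaultdict(list)
    for api in inventory:
        by_capability[api["capability"].strip().lower()].append(api)
    return {cap: apis for cap, apis in by_capability.items() if len(apis) > 1}


# Hypothetical inventory: two teams have independently built customer lookup
inventory = [
    {"name": "customer-api", "team": "sales", "capability": "Customer Lookup"},
    {"name": "crm-customer-svc", "team": "marketing", "capability": "customer lookup"},
    {"name": "orders-api", "team": "sales", "capability": "Order Entry"},
]
for capability, apis in find_duplicates(inventory).items():
    print(capability, "->", [a["name"] for a in apis])
```

The hard part in reality is not this grouping logic but producing the inventory itself, which, as noted above, most companies cannot do.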
Risk Management around API Governance
Data breaches and cyber-attacks can disrupt normal business operations, leading to downtime, productivity losses and service disruptions. The longer it takes to restore systems and resume operations, the greater the financial impact. Additionally, businesses may suffer reputational damage and customer churn if they fail to respond effectively to the incident.
Recent Data Breaches in the News
Also, depending on the nature of the attack and the data compromised, organisations may face legal and regulatory consequences. This can include fines, penalties, litigation and the cost of legal defence.
However, despite the rising number of cyber-attacks, even on companies with a strong history of protecting customer data, most companies are not following simple steps to avoid becoming victims of security breaches. On the contrary, most organisations, big and small, are constantly increasing their risk of being targeted, driven by the ever-growing use of data integration workflows and API adoption.
API adoption growth has led to a problem called “API sprawl”, the uncontrolled proliferation of poorly managed APIs within an organisation. It occurs when there is an excessive number of APIs, often created by different teams or departments, without proper and consistent security, governance and documentation.
According to Salt Labs (https://salt.security/api-security-trends), malicious API traffic increased by 681% in 2022 alone. Salt Labs also reported that more than 34% of organisations do not have proper API security in place, and that more than 91% of APIs openly expose Personally Identifiable Information (PII) and sensitive data to threat attacks.
Ironically, mitigating the risk of data breaches and cyber-attacks is neither hard nor expensive.
This phase of the assessment focuses on identifying the level of maturity for risk and vulnerability management around the full enterprise API and integration ecosystem.
It focuses on analysing the level of maturity in the following areas:
Enterprise API Inventory – Ability to scan all APIs in use vs. those deprecated, including information such as API versions, policy enforcement, sensitive data management, risk factors, etc.
End-to-end Application Flow Transaction Visibility – Graphical representation of traffic, tracing, bottlenecks, errors, insights, etc.
Drill-Down Troubleshooting - Ability to easily drill down into a specific API or integration security analytics and log aggregation.
API Attack Management – Runtime attack identification and the ability to automatically block attacks.
Post-mortem Analysis – The ability to perform root-cause analysis, identify attacks, evaluate security postures, discover exposure and recommend remediation.
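The five areas above can be turned into a simple maturity scorecard. The sketch below is an assumed scoring scheme (levels 0 to 4, with a target of 3), not a formal framework, but it shows how the assessment output can be made concrete and comparable over time:

```python
AREAS = [
    "Enterprise API Inventory",
    "End-to-end Flow Visibility",
    "Drill-Down Troubleshooting",
    "API Attack Management",
    "Post-mortem Analysis",
]


def assess_maturity(levels: dict[str, int], target: int = 3):
    """Return the average maturity level and the areas below target.

    `levels` maps each area to 0 (absent) .. 4 (optimised);
    the scale and target are illustrative assumptions.
    """
    score = sum(levels.get(area, 0) for area in AREAS) / len(AREAS)
    gaps = [area for area in AREAS if levels.get(area, 0) < target]
    return round(score, 2), gaps


# Hypothetical client scores
levels = {
    "Enterprise API Inventory": 1,
    "End-to-end Flow Visibility": 2,
    "Drill-Down Troubleshooting": 3,
    "API Attack Management": 0,
    "Post-mortem Analysis": 2,
}
score, gaps = assess_maturity(levels)
print(f"overall maturity: {score}; gaps: {gaps}")
```

Areas listed in `gaps` would then feed directly into the risk-mitigation part of the implementation roadmap.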
Vulnerability management and risk mitigation play a critical role in maintaining the security, integrity and resilience of an organisation's operations and reputation in today's complex and evolving threat landscape.
Early detection and remediation of vulnerabilities are generally less costly than dealing with the aftermath of a security breach. Effective vulnerability management allows organisations to address security issues before they are exploited, reducing the financial impact of potential incidents, such as recovery costs, legal fees, fines, and reputational damage.
Businesses aspire to move swiftly, often outpacing IT capabilities. With limited resources available for using complex and outdated integration technologies, IT struggles to meet the demands of business users.
The Foundational Assessment presents an opportunity for IT leaders to revisit their initial vision, considering the inevitable market changes and any deviations in business goals. It allows them to assess how the Data Integration layer has been implemented and rectify any discrepancies with those business objectives, ensuring future-proofing and compliance with current and future budget constraints.
How can we support you?
At foryouandyourcustomers, we understand the unique challenges and opportunities inherent in your Data Integration journey. We've crafted an efficient and cost-effective method that ensures your business reaches its full potential.
We advocate starting with a comprehensive foundational assessment to map out your current landscape and to understand the potential for growth and cost-saving.
A 3-Step Approach starting with the Foundational Assessment
Our approach, designed with our clients' needs in mind, encompasses two key stages of implementation. Initially, we focus on immediate cost-impacting activities, aimed at optimising your current data integration framework and advocating for technology diversification. Following this, we shift focus towards executing strategies that address long-term cost effects, predominantly risk mitigation and security measures in the face of potential threats and uncertainties, particularly those associated with poorly managed APIs.
We're eager to support you through every step of this journey and look forward to helping you optimise your Data Integration landscape for efficiency, cost-effectiveness, and future growth.
Get in touch with the author
Interested in enhancing your Data Integration Landscape? Don't hesitate to reach out to our expert and author of this article, Carlos, for tailored advice and solutions.
Carlos R. Iturria
Experienced Tech Leader who bridges the gap between technology and business, specialising in Integration, API management, and Security practices.
Supports his clients in their digital change with passion and practical experience.