ANNOUNCEMENT

Percipient-HPE Partnership

Percipient and Hewlett Packard Enterprise partner to launch a revolutionary In-Memory Data Virtualization and Integration Appliance architecture.

Built on HPE ProLiant™ systems and UniConnect™, Percipient's industry-leading flagship product, this optimized hardware-software architecture is the first of its kind, unifying Big Data aggregation and analytics across unlimited data streams while delivering unprecedented efficiencies in power, scalability, availability, reliability, physical footprint, and total cost of ownership.

Such a NextGen Data Integration ecosystem heralds the arrival of innovative solutions for application/report delivery reliant on varied data sources, data virtualization requirements, unified data access needs and archival scenarios. Productivity will be enhanced across costly OLTP systems, accelerated Extract, Transform, Load (ETL) needs, and centrally-hosted infrastructures.

Welcome, Marc

Percipient warmly welcomes Marc Giretto as its Chief Data Officer

Marc has over 20 years of experience applying advanced analytics to solve complex business issues. He helped shape analytic strategy as Global Head of Marketing Analytics for Standard Chartered Bank from 2009 to 2012, and as Director of Decision Management for Citi Australia from 2012 to 2014. Moving to ANZ in 2014 as the bank's Australia Head of Analytics, Marc oversaw multi-departmental analytics teams comprising over 120 staff.

Marc is a mathematics graduate from the UK's De Montfort University in Leicester and is an avid motorsport enthusiast.

CEO’s DESK

A New (Memory-Driven) World

In-memory ecosystems are capable of powering enterprises’ big data ambitions, from data storage and transformation to analytics and business intelligence. Yet the myth remains that memory solutions are too expensive, especially if your data volume is either too small or too large, or your operational needs are non-critical.

In fact, while DRAM prices were edging up in 2016 and most of 2017 due to increased demand, Gartner predicts that prices will crash by 2019 as manufacturers adjust their production and eventually flood the market.

Price trends aside, the longer-term advantages of memory computing come to the fore when enterprises put on their big data hats. Data is doubling every two years, and with AI, data queries are also becoming more complex.

This data will become impossible to store and process cost-effectively unless enterprises work out a way to do so virtually. The case for virtualisation is not a simple comparison between disk and memory costs: virtualisation software is capable of processing data not only more speedily but also more efficiently and with fewer staging requirements, making such approaches ideal for both small and large amounts of data.

Percipient’s recently signed MOUs with Intel and Hewlett Packard Enterprise are yet another milestone in our commitment to help enterprises build in-memory ecosystems that serve a plethora of next-gen data applications.

Data Virtualisation’s Coming Of Age


DV is no longer a well-kept secret, but what should you consider before taking the plunge?

Data Virtualisation (DV) enterprise solutions have been around since the early 2000s, but are seeing a spectacular surge in interest.

Such solutions were originally an attempt to solve the EII (Enterprise Information Integration) challenge, i.e. proprietary platforms that resolutely refused to talk to each other. MetaMatrix was one of the first vendors to create a single abstracted layer to integrate metadata. This was a radical solution designed to help enterprises dramatically reduce the time needed to develop their business applications.

However, early DV solutions lacked the requisite data governance and data security features, and sat uneasily inside inflexible enterprise data architectures. As a result, DV got something of a bad rap for bringing along “challenges with manageability, usability, data quality and performance”.

New Age, New Demand

The rise of big data and the race to build more functionality has given DV technologies new-found significance.

A Forrester report tells us that in 2017, over half (56%) of all global technology decision makers surveyed had already implemented, or were in the process of implementing, DV technology, up from 45% in 2016. Another 20% said they planned to invest in DV over the next 12 months.

Forrester points to three key reasons for DV’s current appeal:

1) DV solutions bring simplicity to an increasingly complex data environment.

Not only are data volumes increasing, but so are the varieties of data sources, formats and locations. Coupled with this are the increased demands made on data – from customer applications to AI-powered business insights. DV solutions mask this complexity by offering a single gateway to all enterprise data, for all enterprise users.

2) DV solutions enable real time analytics – previously the purview of only the most sophisticated digital companies.

By joining real time (e.g. clickstream, IoT) data with batch-based (e.g. customer, sales) data, DV technology brings to all enterprises the business insights that digital giants like Amazon and Netflix have long enjoyed. DV does this by “pushing” queries to the data source, thereby achieving high compute efficiency and distinguishing itself from standard data federation technology (see the sketch after this list).

3) DV adoption has spread not just vertically but also horizontally.

Early DV adopters tended to be ones with strong real time data integration needs, e.g. large-scale online retailers. However, DV solutions have now found their way into technologically conservative industries like insurance, healthcare and government. This is because DV usage has evolved to include more mundane but operationally crucial processes, such as master data management and enterprise-wide data discovery.
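To make the “pushing” of queries in point 2 above concrete, here is a minimal Python sketch of the idea, using two in-memory SQLite databases as hypothetical stand-ins for a real-time clickstream store and a batch customer warehouse. The schemas and values are invented for illustration; an actual DV platform would generate and route these source-level queries automatically rather than by hand.

```python
import sqlite3

# Hypothetical "clickstream" source: fast-arriving page-view events.
clicks = sqlite3.connect(":memory:")
clicks.execute("CREATE TABLE clickstream (customer_id INTEGER, page TEXT, ts TEXT)")
clicks.executemany(
    "INSERT INTO clickstream VALUES (?, ?, ?)",
    [(1, "/pricing", "10:00"), (2, "/home", "10:01"), (1, "/checkout", "10:02")],
)

# Hypothetical "customer" source: slow-moving batch data.
batch = sqlite3.connect(":memory:")
batch.execute("CREATE TABLE customers (customer_id INTEGER, segment TEXT)")
batch.executemany("INSERT INTO customers VALUES (?, ?)", [(1, "premium"), (2, "standard")])

# Pushdown: each source evaluates its own part of the query locally,
# so only small, pre-aggregated result sets leave the source systems.
page_counts = clicks.execute(
    "SELECT customer_id, COUNT(*) FROM clickstream GROUP BY customer_id"
).fetchall()
segments = dict(batch.execute("SELECT customer_id, segment FROM customers"))

# The virtualization layer then joins only the reduced results.
for customer_id, visits in page_counts:
    print(customer_id, segments.get(customer_id), visits)
```

The design point is that the GROUP BY runs where the clickstream lives; only a handful of rows per customer cross the network, which is what distinguishes pushdown from federation approaches that haul raw data to a central engine first.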

Industry Breakdown

So what are the pluses and minuses of adopting DV technology for specific industries?

Healthcare

Healthcare delivery models have moved from a parallel to an interdisciplinary approach, where patients sit at the center of a constellation of services, research and education. As a result, healthcare data is only becoming more, not less, fragmented. DV enables multiple agencies to access common and updated patient records. This means a single source of truth without unnecessary copying of sensitive data.

However, healthcare IT architects must be cognizant of internet connectivity, which can become a problem in smaller communities and rural areas. Many healthcare practitioners will also need to be convinced that DV platforms meet the strict user access controls expected of any health records environment.
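As one illustration of what such controls might look like at the virtualisation layer, the short Python sketch below applies a hypothetical row-level rule (each agency sees only its own patients) and column-level masking to a shared record set. The policy, schema and data are all invented for illustration; a production platform would enforce equivalent rules centrally rather than in application code.

```python
# Invented patient records standing in for a shared, uncopied source of truth.
PATIENT_RECORDS = [
    {"patient_id": 1, "agency": "clinic_a", "diagnosis": "flu", "id_number": "S1234567A"},
    {"patient_id": 2, "agency": "clinic_b", "diagnosis": "asthma", "id_number": "S7654321B"},
]

SENSITIVE_COLUMNS = {"id_number"}  # hypothetical columns masked by default

def virtual_view(agency, can_see_sensitive=False):
    """Yield only the rows and columns this agency's policy allows."""
    for record in PATIENT_RECORDS:
        if record["agency"] != agency:
            continue  # row-level rule: other agencies' patients stay invisible
        yield {k: v for k, v in record.items()
               if can_see_sensitive or k not in SENSITIVE_COLUMNS}

print(list(virtual_view("clinic_a")))                          # masked view
print(list(virtual_view("clinic_a", can_see_sensitive=True)))  # privileged view
```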

Retail

Expensive data management and storage solutions are clearly beyond the means of mid-sized retail outlets. Yet the opportunities for real time, device and location-based marketing have increased dramatically for this segment of businesses. DV technology enables them to sidestep high-investment data infrastructures, while leveraging increasingly affordable IoT technology.

The primary consideration for such firms is the upfront expense of memory hardware. While memory can be expensive, the ready availability of DV-as-a-Service is helping SMEs over this hump. Cloud instances not only allow SMEs to pay as they go, but also help them keep track of both consumed memory and the active memory needed for potential caching, thereby ensuring optimal usage.

Financial Services

DV technology could apply to a host of financial sector use cases, but the steep increase in the quantity and depth of regulatory reporting is perhaps one of the most pressing. In the wake of new customer data protection laws, data sharing requirements, and central bank demands for granular financial data, there is now little room for highly manual processes. DV technology offers reporting teams easy access to raw data that can be quickly transformed to align to regulatory templates.
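As a minimal sketch of that transformation step, the Python snippet below aggregates invented raw exposure rows into the cells of a hypothetical regulatory template (exposures by counterparty sector and maturity bucket). The template, bucket boundaries and figures are assumptions for illustration only; in a DV setting this mapping would typically live in a virtual view applied at query time, leaving the raw data in place.

```python
from collections import defaultdict

# Invented raw exposure rows, standing in for a query against source systems.
RAW_EXPOSURES = [
    {"sector": "corporate", "tenor_days": 40, "amount": 1_000_000},
    {"sector": "bank", "tenor_days": 400, "amount": 2_500_000},
    {"sector": "corporate", "tenor_days": 10, "amount": 750_000},
]

def maturity_bucket(tenor_days):
    """Map a raw tenor onto the template's coarse maturity buckets."""
    if tenor_days <= 30:
        return "<= 1 month"
    if tenor_days <= 365:
        return "1 month - 1 year"
    return "> 1 year"

# Aggregate raw rows into the (sector, bucket) cells the template expects.
report = defaultdict(float)
for row in RAW_EXPOSURES:
    report[(row["sector"], maturity_bucket(row["tenor_days"]))] += row["amount"]

for (sector, bucket), total in sorted(report.items()):
    print(f"{sector:10s} {bucket:18s} {total:>12,.0f}")
```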

However, the benefits made possible by DV technology are diminished when the data is of poor quality, or when the calculation logic to be used is fuzzy. It is also crucial that DV platforms are assessed for their data governance and data lineage capabilities in order to keep track of how data is being aggregated.

And as the acceptance of DV technology grows further, industry players can look forward to a shift from generic DV-based applications to ones designed for specific industry needs.