Overview

The existing client application is a cognitive, cloud-based platform designed to:

  • Enable researchers, clinicians, and CxOs to mine medical records and stay on top of new publications by converting unstructured real-world data into actionable insights
  • Extract high-value knowledge for downstream research or treatment planning
  • Connect to hundreds of data sources
  • Build and publish complex data analysis pipelines (see the sketch after this list)
  • Visually produce insights from complex data
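
As an illustration of the pipeline capability above, the sketch below shows how analysis stages might be chained over documents pooled from several sources. It is a minimal, hypothetical Python example; the Document type, stage functions, and source names are assumptions for illustration only, not the client platform's actual API.

    from dataclasses import dataclass, field
    from typing import Callable, Iterable, List

    # Hypothetical record type: one unstructured clinical or publication document.
    @dataclass
    class Document:
        source: str
        text: str

    Stage = Callable[[Iterable[Document]], Iterable[Document]]

    @dataclass
    class Pipeline:
        """Chains analysis stages over documents pooled from multiple sources."""
        stages: List[Stage] = field(default_factory=list)

        def add_stage(self, stage: Stage) -> "Pipeline":
            self.stages.append(stage)
            return self  # fluent chaining: pipeline.add_stage(a).add_stage(b)

        def run(self, *sources: Iterable[Document]) -> List[Document]:
            # Pool documents from every connected source, then apply each stage in order.
            docs: Iterable[Document] = (d for src in sources for d in src)
            for stage in self.stages:
                docs = stage(docs)
            return list(docs)

    # Hypothetical stages: keep oncology-related notes, then shorten them for review.
    def keep_oncology(docs: Iterable[Document]) -> Iterable[Document]:
        return (d for d in docs if "oncology" in d.text.lower())

    def shorten(docs: Iterable[Document]) -> Iterable[Document]:
        return (Document(d.source, d.text[:80]) for d in docs)

    if __name__ == "__main__":
        ehr = [Document("ehr", "Oncology follow-up note for patient A."),
               Document("ehr", "Cardiology consult note for patient B.")]
        pubs = [Document("pubmed", "New oncology trial results published this week.")]
        for doc in Pipeline().add_stage(keep_oncology).add_stage(shorten).run(ehr, pubs):
            print(doc.source, "->", doc.text)

Each stage is a plain function over an iterable of documents, so new sources or analysis steps can be added without modifying the pipeline runner.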

Challenge

The end users, however, faced hurdles in several areas:

  • Slow data computation
  • Difficulty extracting large volumes of unstructured clinical data
  • Delays in job status tracking in the visual pipeline manager
  • Steep, continuous growth in data size
  • A legacy system unable to produce results quickly enough to meet specific clinical and research requirements

Solution

Team CSS conducted a detailed analysis to address these challenges. The first phase of the solution design involved a deep review of the technology, business, and processes. The major activities conducted were:

  • Conducting a technical evaluation of the legacy application
  • Conducting a fit-gap analysis
  • Identifying scalable components in the system
  • Creating a visionary product roadmap
  • Identifying the need to re-architect the existing solution
  • Defining the components to be replaced with faster, more scalable computation capabilities

Outcome

Following successful evaluation, discussion, and brainstorming sessions, CSS delivered a comprehensive analytical report. The major outcomes of the new solution design were:

  • The big data architecture provides vertical scalability to extract large volumes of unstructured clinical data
  • No practical limit on growth in data size
  • Complete real-time visibility into the data, leading to quick, actionable insights that meet specific clinical and research requirements
  • The new architecture creates a roadmap for further downstream research and treatment planning
  • Computation speed is maintained irrespective of data size, helping researchers stay on top of new publications and medical records
  • Fast extraction of large volumes of data from multiple sources, converting unstructured real-world data into actionable insights (see the extraction sketch after this list)
  • Design of the hardware required to support the big data ecosystem
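
To make the extraction outcome concrete, the sketch below shows one way large volumes of unstructured clinical notes could be scanned in parallel on a distributed engine. It is a minimal sketch only, assuming PySpark and plain-text notes; the storage path, keyword list, and aggregation are hypothetical illustrations rather than the delivered architecture.

    # Minimal sketch, assuming PySpark; the path and keywords are hypothetical.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("clinical-note-extraction").getOrCreate()

    # Read unstructured notes in parallel across the cluster; each text line becomes a row.
    notes = spark.read.text("s3://example-bucket/clinical-notes/*.txt")

    # Simple keyword flags stand in for real clinical NLP extraction.
    keywords = ["diabetes", "hypertension", "oncology"]
    flagged = notes.select(
        F.col("value").alias("note_text"),
        *[F.col("value").rlike(f"(?i){kw}").alias(kw) for kw in keywords],
    )

    # Aggregate into counts that could feed dashboards or downstream research.
    counts = flagged.agg(
        *[F.sum(F.col(kw).cast("int")).alias(f"{kw}_mentions") for kw in keywords]
    )
    counts.show()

Because the work is partitioned across executors, scaling the supporting hardware increases extraction throughput without changes to the extraction code.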