One of the biggest challenges Big Data customers face is being assured of the Return on Investment (ROI) of implementing Big Data infrastructures, and asking the right questions, according to Suman Biswas, Chief Software Architect. "There is a massive amount of data out there, but what exactly do you need – requires a clear understanding of objectives; what are the key metrics – requires pinpointed areas of improvement; what problem does it solve – requires understanding of the business domain; how does it make things better – requires the foresight to see the solution work," he says.
Big Data project delivery timelines are often cluttered with a preamble: the learning curve associated with 'sort of new' and emerging Big Data technologies, leading to situations where time is not on your side.
The company has an impressive track record of modernizing legacy systems with innovative emerging Big Data technologies (Hadoop, NoSQL, advanced geo-mapping data). They have implemented scalable, cloud-hosted solutions that enable fast analytics on massive GIS data sets. Built on the open source Hadoop ecosystem, these solutions enable extreme-scale processing (petabytes) to extract and disseminate meaningful information. "Our BI solutions atop these systems also offer simulation and predictive analytics with intuitive visualizations and dashboards," he adds.
Different platforms, Big Data, small data, and a range of software applications that specialize in specific business processes work in unison toward unified organizational outcomes. Their custom architectural framework formulates the structure and behavior between application packages, databases, and middleware by focusing on how they interact with each other. The framework comprises a flexible infrastructure containing a secure data lake, data transformation/enrichment views, and detailed visualizations accessible via web and mobile devices. By extending this framework to a COTS methodology, they are able to provide faster, cheaper, and better solutions.
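The layering described above — raw records landing in a data lake, a transformation/enrichment view deriving analysis-ready rows, and a summary feeding the visualization layer — can be sketched minimally. All names and the record schema here (`enrich_record`, `build_view`, `flood_depth_ft`) are illustrative assumptions, not NiyamIT's actual API.

```python
# Minimal sketch of a data lake -> enrichment view -> dashboard summary
# pipeline. Field names and thresholds are hypothetical examples.
from statistics import mean

# Raw records as they might sit in the data lake (schema is assumed).
DATA_LAKE = [
    {"site": "A", "flood_depth_ft": 2.5},
    {"site": "B", "flood_depth_ft": 6.0},
    {"site": "C", "flood_depth_ft": 0.5},
]

def enrich_record(rec):
    """Transformation/enrichment step: derive a risk category."""
    depth = rec["flood_depth_ft"]
    category = "high" if depth >= 5 else "moderate" if depth >= 1 else "low"
    return {**rec, "risk": category}

def build_view(records):
    """Enriched 'view' plus a small aggregate for a dashboard widget."""
    enriched = [enrich_record(r) for r in records]
    summary = {"avg_depth_ft": mean(r["flood_depth_ft"] for r in enriched)}
    return enriched, summary

enriched, summary = build_view(DATA_LAKE)
print(summary["avg_depth_ft"])  # -> 3.0
```

The point of the pattern is that the lake keeps raw data untouched while views like `build_view` are regenerated on demand for each consumer.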
NiyamIT, along with IBM, has played a key role in transforming FEMA's Risk Mapping, Assessment and Planning (Risk MAP) program from an IT solution perspective. Over time, the Risk MAP program has generated over 250 TB of data that is not yet leveraged to its full potential. According to him, the company has replaced an obsolete content manager and built a modernized "Engineering Library" with a distributed RESTful query engine powered by Elasticsearch (on the Hadoop file system). This enables users to locate necessary information and artifacts in milliseconds instead of minutes. In addition, they have integrated with data.gov to share Risk MAP data with other agencies and consumers as part of the government-wide open data initiative. "We see a monthly usage of over 10 million hits for National Flood Hazard Layer (NFHL) GIS services, developed by us. We are currently working on a solution to bring over 240 TB of Coastal Analysis data into FEMA's Engineering Library powered by SoftLayer cloud and FASP-based IBM Aspera technology."
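As a rough illustration of how an artifact lookup against such an Elasticsearch-backed library might be phrased, the sketch below builds a standard Elasticsearch `_search` request body. The index name (`engineering-library`) and field names (`title`, `updated_at`) are assumptions for illustration; the actual schema is not described in the article.

```python
# Hedged sketch: querying a hypothetical "engineering-library" index
# via Elasticsearch's REST _search API. Endpoint, index, and fields
# are placeholders, not the real FEMA deployment.
import json
import urllib.request

ES_URL = "http://localhost:9200"   # placeholder endpoint
INDEX = "engineering-library"      # hypothetical index name

def build_artifact_query(text, size=10):
    """Full-text match on an assumed 'title' field, newest first."""
    return {
        "size": size,
        "query": {"match": {"title": text}},
        "sort": [{"updated_at": {"order": "desc"}}],  # assumed field
    }

def search(text):
    """POST the query to the index's _search endpoint."""
    body = json.dumps(build_artifact_query(text)).encode("utf-8")
    req = urllib.request.Request(
        f"{ES_URL}/{INDEX}/_search",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Query construction alone needs no running cluster:
q = build_artifact_query("coastal analysis", size=5)
print(q["query"]["match"]["title"])  # -> coastal analysis
```

Because Elasticsearch shards and replicates the index, queries like this fan out across nodes in parallel, which is what turns minutes-long searches into millisecond ones.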
The company offers customers a three-point program (Assess, Strategize, and Implement), which is a key differentiator for organizations seeking competitive advantage, cost reductions, and operational efficiencies through modernization.
The firm is poised to ride the analytics wave, predictive analytics in particular. Their strategy for the next few years is to cross new frontiers in Business Intelligence by guiding customers from the right beginnings all the way to the top. This includes providing infrastructure capable of handling massive amounts of data and high-speed analytics, a data management layer, data transformation and enrichment views that are tailor-made on demand, and a rich visualization layer that provides actionable intelligence.