McMaster Big Data Forum

Published August 1, 2017 by Solene Castex


By: Nigel Fonseca

I recently had the pleasure of being a panelist at the inaugural McMaster Big Data Forum, held this past September. McMaster University recognizes that big data is transforming the way people do business and sees collaboration between government, the private sector and the education system as a key factor in delivering innovative results. This one-day event brought together leading big data experts to discuss the challenges and key issues that arise when data is used for innovation. Most panelists shared inventive case studies on the use of big data in the consumer retail and healthcare sectors (http://www.braininstitute.ca/blog-entry/dementia-and-big-data).

I was a panelist for the small to medium enterprise (SME) perspective on “Using Data for Innovation”. I see big data as a key component in delivering on the promise of using data for more comprehensive decision-making. The convergence of storage, processing power, bandwidth and open source technologies allows SMEs to make use of big data internally and to provide valuable offerings to clients. Our organization works with clients to integrate their existing data warehouses with big data, and we leverage the late-binding capabilities inherent in big data technology on top of traditional data solutions. This allows customers to discover the “unknown unknowns”. The Big Data Forum helped participants understand the challenges and opportunities within the big data environment and, most importantly for this audience, illustrated how public-private partnerships can lead to successful applications.
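To make the late-binding idea concrete, here is a minimal schema-on-read sketch using PySpark. It is illustrative only: the file path, field names and the SparkSession setup are assumptions, not details taken from the engagements described above.

```python
# Late binding (schema-on-read) sketch: raw data is landed as-is, and the
# schema is applied only when a question is asked of it.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("late-binding-sketch").getOrCreate()

# Raw events are stored without an upfront schema (hypothetical path).
raw = spark.read.json("s3://example-bucket/raw/events/")

# The schema is bound at query time: only the fields this question needs are
# projected and cast, while unexpected fields remain available for later
# exploration of the "unknown unknowns".
purchases_per_user = (
    raw.select(
        F.col("user_id"),
        F.col("event_type"),
        F.to_timestamp("event_time").alias("event_time"),
    )
    .where(F.col("event_type") == "purchase")
    .groupBy("user_id")
    .count()
)

purchases_per_user.show()
```

Because the schema lives in the query rather than in the load process, new questions can be asked of old data without re-ingesting it, which is what makes this approach a useful complement to a traditional, schema-on-write data warehouse.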
