
Public Service Announcement: Best Practice Guide for Big Data in the Public Sector


As one would expect, the private sector has been the earlier adopter of big data solutions. Private companies have long been the first to invest in new technology, and with big data analytics they have been able to harness its benefits: a 360-degree view of their customers' behaviors, patterns, interests, and feelings about products and services. The public sector, on the other hand, has been known as a late entrant to new technology and, for the most part, has been slow to adopt big data analytics.


This is why I'd like to highlight the Australian public sector for taking the initiative to develop a best-practices document for big data analytics, the Australian Public Service Better Practice Guide for Big Data. The guide discusses the key issues government agencies face in making big data solutions integral to their research, delivery, and reporting methods. The issues covered include, but are not limited to, privacy impact, third-party datasets, locating important datasets, data project management, responsible use, and infrastructure.

The guide stresses that public service agencies must scrutinize what kind of infrastructure will best provide the analytics they require. Below are some of the options it discusses:

Cloud computing – which is suited to internal applications that are loosely integrated or that have undefined demand for processing and storage.

In-memory processing – which stores information in the random access memory (RAM) of dedicated servers, enabling real-time access, rather than in relational databases running on slower disk drives.

Supercomputing – which uses a system's specialized infrastructure to interpret highly interdependent data and is employed for applications that require immense amounts of mathematical calculation, such as modeling and correlation.

Grid computing – which pools computer resources from multiple locations to handle many tasks at the same time without inter-processor communication. It is usually associated with scientific or technical problems that require a great number of processing cycles (a brief sketch follows this list).

In-database processing – which speeds up analytics by limiting the movement of data while processing is conducted. This provides faster run times and is well suited to data discovery and exploration (also sketched below).
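To make the last two options more concrete, here are two minimal sketches. The first illustrates the grid-computing idea with a hypothetical, embarrassingly parallel workload: each dataset is processed independently, and the workers never exchange intermediate results. The file names and the summarise function are invented for illustration.

```python
from concurrent.futures import ProcessPoolExecutor

def summarise(dataset_path: str) -> str:
    # Placeholder for an expensive, independent analysis step.
    return f"processed {dataset_path}"

# Each task runs in its own process and never communicates with the
# others; that independence is what makes a workload a good fit for
# a grid.
if __name__ == "__main__":
    paths = ["census.csv", "transport.csv", "health.csv"]
    with ProcessPoolExecutor() as pool:
        for result in pool.map(summarise, paths):
            print(result)
```

The second sketch contrasts pulling every row into the application with pushing the aggregation down to the database, so that only the summarized result crosses the wire. The database file and table are hypothetical, not taken from the guide.

```python
import sqlite3

# Hypothetical database with a table
# service_requests(agency TEXT, response_days REAL).
conn = sqlite3.connect("agency_data.db")

# Without in-database processing: every row is shipped to the
# application, and the averaging happens in Python.
totals = {}
for agency, days in conn.execute(
        "SELECT agency, response_days FROM service_requests"):
    count, total = totals.get(agency, (0, 0.0))
    totals[agency] = (count + 1, total + days)
app_side = {a: t / c for a, (c, t) in totals.items()}

# With in-database processing: the aggregation runs where the data
# lives, and only one small row per agency leaves the database.
in_db = dict(conn.execute(
    "SELECT agency, AVG(response_days) FROM service_requests GROUP BY agency"))
```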

Lastly, because scalability is crucial to big data, the guide advises estimating system storage specifications by defining the following (a rough worked example follows the list):

Performance criteria – or the number of users, the types of queries, the speed of the task, the related storage configurations, and the margin of error.

Compatibility requirements – or whether the application needs to be coupled with other systems or existing data sources, or is a standalone development environment.

Extensible functions – or the ability to extend platforms without hindering overall capacity.
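As a loose illustration of how performance criteria like these might feed a storage estimate, here is a back-of-envelope calculation. Every figure below is invented for the example; real numbers would come from an agency's own workload analysis, not from the guide.

```python
# Hypothetical inputs for a storage estimate (all figures invented).
concurrent_users = 500      # expected analyst head count
avg_result_set_mb = 50      # working-set size per active query
retained_raw_tb = 20        # raw datasets kept online
replication_factor = 3      # copies kept for fault tolerance
growth_margin = 1.25        # 25% headroom for error and growth

working_tb = concurrent_users * avg_result_set_mb / 1_000_000  # MB to TB
usable_tb = (retained_raw_tb + working_tb) * replication_factor * growth_margin
print(f"Provision roughly {usable_tb:.1f} TB of raw storage")
```

The point is not the arithmetic but its shape: each criterion the guide names (users, queries, storage configuration, margin of error) maps to an input that changes the final figure.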

Click here to read the entire guide: Australian Public Service Better Practice Guide.
