Getting an Amber Alert out far and wide as soon as possible is critical to successfully recovering a child. In fact, pick any emergency event: getting information to the first responders who need it as quickly as possible is critical to saving lives. All of these events are time sensitive, and they call for systems architected to consume events and trigger the appropriate event processing. Because events come fast and furious in a full-blown emergency, the system design must separate event logging/recording from event processing so that no events are lost or dropped.
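The logging/processing separation described above can be sketched with a simple producer/consumer queue. This is a minimal in-process illustration, not a production design; a real system would use a durable message broker so that bursts of events are persisted even if processing lags.

```python
# Sketch: decouple fast event recording from slower event processing.
# The queue buffers bursts so events are recorded immediately, not dropped.
import queue
import threading

events = queue.Queue()
log = []

def record(event):
    """Fast path: enqueue the event the moment it arrives."""
    events.put(event)

def process_events():
    """Slow path: drain the queue at whatever pace processing allows."""
    while True:
        event = events.get()
        if event is None:  # sentinel tells the worker to stop
            break
        log.append(f"handled: {event}")
        events.task_done()

worker = threading.Thread(target=process_events)
worker.start()
for e in ["amber-alert", "road-closure", "shelter-open"]:
    record(e)
events.put(None)
worker.join()
print(log)
```

Because recording is just an enqueue, the producer never blocks on downstream processing, which is the property the paragraph above argues for.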
There are many different event processing and notification systems, and it is unreasonable to expect organizations to adopt a single standard. To be successful, any event system must be able to consume events from other event systems, and must also be able to push events into them in turn. The ability to “ripple” events across a disparate network of systems is what this is all about. It is left as an exercise for the reader to identify such a product. Think XML, and the need to move data across as many as 300 different systems spanning all branches of government, including justice, public safety, emergency and disaster management, intelligence, and homeland security.
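The “ripple” idea amounts to a bridge that consumes an event in one system's format and republishes it in another's. The sketch below uses two entirely hypothetical systems (a pipe-delimited feed and a JSON consumer) to show the shape of such an adapter; real bridges would translate between actual wire formats.

```python
# Sketch: a bridge that consumes events from "System A" and pushes them
# onward to "System B". Both systems and their formats are hypothetical.
import json

def from_system_a(raw: str) -> dict:
    """Consume System A's pipe-delimited event format."""
    kind, detail = raw.split("|", 1)
    return {"kind": kind, "detail": detail}

def to_system_b(event: dict) -> str:
    """Push the same event onward as JSON for System B."""
    return json.dumps(event)

outbound = [to_system_b(from_system_a(r))
            for r in ["AMBER|child last seen downtown",
                      "FIRE|warehouse district"]]
print(outbound)
```

Chaining adapters like these is what lets an event ripple across systems that were never designed to talk to each other.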
Adding social media to the mix
In a public emergency, the power of social media cannot be ignored. We know how Twitter has been used by authorities to locate people in an emergency. Why not use social media to notify the public? It isn’t a big jump to view Twitter as a public event notification system. One could argue that each “tweet” is in fact an event. Using hashtags, one can spread a message far and wide. Tweets are small (like event notices) and support location. For some types of time-sensitive events, notifying the public is critical, and it would be hard to find a more cost-effective way to do so than to embrace the existing infrastructure of social media.
So What Is Needed?
Sharing real-time data across agencies, data silos, incompatible systems, sensors, and SCADA devices, while also leveraging social media and mobile devices, presents significant challenges. Access to a middleware platform that readily provides the required integration and connectivity services would help reduce that complexity.
This does not even factor in the need for security frameworks to ensure that data is not tampered with in flight and is accessed only by entitled users. The process is already complex and costly with one information provider and one information consumer, but it grows quadratically more complex and costly as the number of bilateral relations increases: fully interconnecting n systems point-to-point requires n(n−1)/2 separate exchanges.
Worse, because the encoding and security measures must be coded into the applications themselves, change management and testing become prohibitive. All of this sounds daunting, and it is. What is needed is an industry-wide sharing standard. A data sharing standard would eliminate much of the complexity, as would a middleware platform that can play the role of an Enterprise Service Bus, providing out-of-the-box connectivity between these incompatible systems and data sources.
The National Information Exchange Model (NIEM) is a multi-agency framework that defines how to share data. To say that the vision of NIEM is ambitious is an understatement. But the benefit, if it succeeds, is massive: it could greatly increase the efficiency of communication across and outside of government.
NIEM was launched in 2005 for the purpose of developing, disseminating and supporting information-sharing standards and processes across all branches of government including justice, public safety, emergency and disaster management, intelligence and homeland security. Through NIEM, government agency data silos can be bridged to facilitate the sharing of information between them.
NIEM provides a common language, a universal XML-based vocabulary and a framework by which state, local, tribal and federal government agencies may share data in both emergency and day-to-day operations. However, in practice NIEM data must be wrapped in a messaging protocol like SOAP for exchange across the Internet. Proper exchange therefore requires an ability to onboard/offboard NIEM-encoded data into a communications format suitable for the Internet — and then provide the framework to control how that communication is routed, secured and validated to prevent either transmission error or data compromise.
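The onboarding step described above, wrapping an XML payload in a messaging protocol such as SOAP, can be sketched in a few lines. The SOAP 1.1 envelope namespace is real, but the alert payload below is an illustrative placeholder, not content drawn from an official NIEM IEPD.

```python
# Sketch: wrapping an XML payload in a SOAP 1.1 envelope for transport.
# The payload element names are hypothetical stand-ins for NIEM content.
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"
ET.register_namespace("soap", SOAP_NS)

def wrap_in_soap(payload: ET.Element) -> bytes:
    envelope = ET.Element(f"{{{SOAP_NS}}}Envelope")
    body = ET.SubElement(envelope, f"{{{SOAP_NS}}}Body")
    body.append(payload)
    return ET.tostring(envelope, xml_declaration=True, encoding="utf-8")

# Hypothetical alert payload standing in for NIEM-conformant content.
alert = ET.Element("AlertMessage")
ET.SubElement(alert, "AlertType").text = "AmberAlert"
ET.SubElement(alert, "IssuedBy").text = "State Police"

print(wrap_in_soap(alert).decode("utf-8"))
```

Routing, security, and validation would then be applied to the envelope, which is exactly the layering the paragraph above calls for.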
The interesting thing about NIEM is that the framework does not try to dictate data models that all participants must use, as other projects attempt to do. Rather, NIEM leaves the definition of these data models to domain experts through Information Exchange Package Documentation (IEPDs).
This delegation of model definition through IEPDs is very different from other approaches, and one could argue it increases NIEM’s chance of success.
NIEM — The data exchange protocol is XML
The number of systems that must interoperate using NIEM data is mind-boggling, encompassing everything from local law enforcement to 911 systems, fire departments, and Amber Alert warnings.
Rather than define a new data encoding, the team behind NIEM chose an open standards approach that can be embraced by all. It should come as no surprise that the data exchange protocol for NIEM is XML!
When first looking at NIEM, one can get discouraged: the data model itself is absolutely massive, with thousands of different data types. On closer inspection, however, it becomes evident that to perform a particular task, say “Amber Alert,” “Commercial Vehicle Collision,” or “Track IEPD,” one only has to understand the classes relevant to that task, and these are relatively easy for any domain expert to understand.
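The point above, that a task needs only a small slice of the overall model, can be illustrated by extracting just the task-relevant elements from a large document. The namespace URI and element names below are illustrative placeholders, not an official NIEM release.

```python
# Sketch: pulling only the task-relevant elements out of a NIEM-style
# document. Namespace and element names are hypothetical examples.
import xml.etree.ElementTree as ET

NC = "http://example.org/niem-core"  # placeholder for the NIEM Core namespace
doc = ET.fromstring(f"""
<AmberAlert xmlns:nc="{NC}">
  <nc:Person>
    <nc:PersonName>
      <nc:PersonGivenName>Jane</nc:PersonGivenName>
      <nc:PersonSurName>Doe</nc:PersonSurName>
    </nc:PersonName>
  </nc:Person>
  <nc:Vehicle>
    <nc:VehicleColorText>Blue</nc:VehicleColorText>
  </nc:Vehicle>
</AmberAlert>
""")

ns = {"nc": NC}
given = doc.findtext(".//nc:PersonGivenName", namespaces=ns)
color = doc.findtext(".//nc:VehicleColorText", namespaces=ns)
print(given, color)  # everything else in the model can simply be ignored
```

A domain expert building an Amber Alert exchange only needs to know the person and vehicle classes; the thousands of other types never enter the picture.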
An Ideal Middleware Platform
Different organizations use different systems, and this is not going to change. Indeed, we do not want this to change, as it is important for all organizations to be able to adopt the system that does the best job for them and to be able to continue to innovate.
For such a platform, one might take a look at the Cameo E2E Bridge.