New data is arriving from an ever-growing array of sources, sensors, and devices. Traditional systems can struggle to keep up with this flow of information, making it difficult for end users to act on data in real time.
Join geospatial information systems technologists, data analytics professionals, and the developer community after work for this free, in-person training and networking meetup to discuss how GIS technology can help deliver real-time data.
You will learn how to:
- Incorporate real-time information streams with your existing GIS data and IT infrastructure.
- Perform continuous processing and analysis against streaming data.
- Produce new streams of data that work seamlessly with ArcGIS.
DATE: Thursday, June 16
TIME: 6:00 – 8:00 PM
LOCATION: 1776, 1133 15th St NW (12th floor), Washington, DC 20005
6:00 – 6:30 PM: Registration & Networking Happy Hour
Network with local GIS, developer and data analysis professionals while enjoying hearty appetizers and an open bar.
6:30 – 7:15 PM: Speaker Presentations
Mark Stevens, Director of Communications-GDS at Oceaneering International, will discuss how Oceaneering is using GIS with its current vessel collection.
Suzanne Foss, Product Engineer for Real-Time GIS at Esri, will share new and upcoming enhancements to ArcGIS for big data analytics. Topics include processing high-velocity real-time data streams into dynamic summary maps using continuous analytics, as well as performing batch analyses such as aggregations, hot spots, and data mining on large volumes of geospatial information.
7:15 – 8:00 PM: Networking Happy Hour and Technology Demonstrations
Enjoy more drinks and appetizers while networking and participating in hands-on technology demonstrations of Esri's redesigned ArcGIS GeoEvent Extension.
The GeoEvent Extension lets you ingest massive data sets, run streaming analytics, detect incidents, and send alerts, all while capturing features to the spatiotemporal big data store. The big data store also takes advantage of distributed computing technology, so you can write up to 10,000 features per second and move past previous bottlenecks.