In research, scientists often generate oceans of data, which can be challenging to capture, store, analyze, and understand. Standard computer systems cannot handle what is known as "big data" -- high-volume, high-velocity data sets. Through a competitive grant process, the National Science Foundation has awarded North Dakota State University a $400,000 grant over three years to create a Data-Intensive Cyberinfrastructure for Research and Education at NDSU, Fargo.
- Massive data for miniscule communities (Wed, 1 Aug 2012, 20:02:53 EDT)
- A 100-gigabit highway for science (Tue, 1 May 2012, 11:34:40 EDT)
- Scientists seeking NSF funding will soon be required to submit data management plans (Mon, 10 May 2010, 11:34:53 EDT)
- Information overload in the era of 'big data' (Mon, 20 Aug 2012, 17:05:59 EDT)
- International science community to establish global virtual library for scientific data (Thu, 23 Oct 2008, 11:24:26 EDT)