Learn Apache Hadoop with Spark in One Day
by Rich Brueckner from High-Performance Computing News Analysis | insideHPC
Hadoop and Spark clusters have a reputation for being extremely difficult to configure, install, and tune, but help is on the way. The good folks at Cluster Monkey are hosting a crash course entitled Apache Hadoop with Spark in One Day. "After completing the workshop attendees will be able to use and navigate a production Hadoop cluster and develop their own projects by building on the workshop examples."
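To give a sense of the kind of workshop example attendees might build on, here is a minimal sketch of a PySpark word count run against data in HDFS. This is not taken from the course materials; the application name and HDFS paths are placeholders, and it assumes a working Spark installation on the cluster.

    # Hypothetical illustration: a minimal PySpark word count over a file in HDFS.
    # The paths and app name below are placeholders, not workshop-specific values.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("WordCount").getOrCreate()

    # Read lines from HDFS, split them into words, and count occurrences.
    lines = spark.read.text("hdfs:///user/demo/input.txt").rdd.map(lambda r: r[0])
    counts = (lines.flatMap(lambda line: line.split())
                   .map(lambda word: (word, 1))
                   .reduceByKey(lambda a, b: a + b))

    counts.saveAsTextFile("hdfs:///user/demo/wordcount-output")
    spark.stop()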