Design, build, and manage large-scale data structures, pipelines, and efficient ETL workflows. Requires knowledge of Hadoop architecture and experience building data pipelines using SQL, Python, or Java. Builds data marts and data models to support clients and customers.
- Experience with bash shell scripts, UNIX utilities, and UNIX commands.
- Knowledge of Java, Python, Hive, Cassandra, Pig, and MySQL, NoSQL, or similar databases.
- Knowledge of Hadoop architecture and HDFS commands, with experience designing and optimizing queries against data in the HDFS environment.
- Experience building data transformation and processing solutions.
- Strong knowledge of large-scale search applications and building high-volume data pipelines.
- At least 3 years' experience preferred.
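The qualifications above center on data transformation and pipeline work. As a rough illustration only (all names and data below are hypothetical, not tied to this role or any specific stack), a minimal extract-transform-load flow in Python might look like:

```python
# Minimal ETL sketch with hypothetical data: extract rows, transform them,
# and load the result into an in-memory "data mart" keyed by region.
from collections import defaultdict

def extract():
    # Stand-in for reading from a source such as HDFS or Hive;
    # each record is a (region, amount) pair.
    return [("east", 100), ("west", 250), ("east", 50), ("west", -5)]

def transform(rows):
    # Drop non-positive amounts and normalize region names.
    return [(region.upper(), amount) for region, amount in rows if amount > 0]

def load(rows):
    # Aggregate into a simple data mart: total amount per region.
    mart = defaultdict(int)
    for region, amount in rows:
        mart[region] += amount
    return dict(mart)

if __name__ == "__main__":
    print(load(transform(extract())))  # {'EAST': 150, 'WEST': 250}
```

In a production pipeline the same three stages would typically read from and write to distributed storage and run under a scheduler, but the shape of the flow is the same.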
Meet Your Recruiter
Alma Hembree, Recruiter
Burtch Works is a contingency and retained executive recruiting firm dedicated to placing highly qualified quantitative professionals within a variety of specialties, including:
Analytics | Data Science | Marketing Research | Consumer Insights | Web/Digital Analytics | Credit & Risk Analytics