
Big Data Architect in Knoxville, Tennessee, US at TeamHealth

Date Posted: 5/11/2020

Job Description

POSITION:

Big Data Architect; Monday–Friday, 8:00 AM – 5:00 PM; full-time, 40 hrs/wk.

EMPLOYER:

Ameriteam Services, LLC / TeamHealth

DUTIES:

The Big Data Architect provides technical expertise and takes a central role in the architecture, design, and implementation of enterprise big data and analytics platforms. The Big Data Architect is expected to understand the relationships among data across all systems and to guide the integration of that data for enterprise-wide reporting and analytics leveraging big data technologies. The Big Data Architect must be a productive member of a delivery and support team for both new and established systems. Specific duties include:

- Create, maintain, and promote architecture and implementation standards for data systems to ensure consistency and performance.
- Design, implement, and support data ingestion and consumption pipelines for enterprise data assets leveraging various big data technologies.
- Interview stakeholders and domain experts to determine data needs.
- Develop and communicate a BI data road map reflecting business priorities.
- Define and document big data architecture needs for enterprise data assets, working with business and IT stakeholders throughout the enterprise.
- Augment business source data with supplemental statistics and results from analysis or predictive models.
- Educate BI developers and the user community in the interpretation and use of that data.
- Understand data management security vulnerabilities and develop associated risk mitigation plans.

 

Job Requirements

REQUIREMENTS:

Bachelor’s degree in Information Technology, Computer Science, or Engineering (or foreign equivalent). 7 years of experience in technology and data architecture, with emphasis in the areas of Information Delivery and Business Intelligence. 7 years of experience architecting and implementing big data platforms with a focus on Apache Hadoop (HDFS, Hive, Sqoop, Spark, Impala). 2 years of experience with SQL and NoSQL technologies. 2 years of experience with cloud technologies such as Azure and AWS. All experience may be gained concurrently.

Alternatively, the employer will accept: Master’s degree in Information Technology, Computer Science, or Engineering plus 5 years of experience in technology and data architecture with emphasis in the areas of Information Delivery and Business Intelligence; 5 years of experience architecting and implementing big data platforms with a focus on Apache Hadoop (HDFS, Hive, Sqoop, Spark, Impala); 2 years of experience with SQL and NoSQL technologies; and 2 years of experience with cloud technologies such as Azure and AWS.