SEEKING EXPERIENCED DATA ARCHITECT
We're a data consulting & analytics business that offers clients services across the entire data ecosystem. We're located in Southern California and have a small team of great engineers who do excellent work and take a partnership approach with the clients we serve. We currently have an immediate project requirement for a qualified Data Architect. Please contact us if interested.
PROJECT:
• Immediate requirement for a current project; we need to get started ASAP
• 40 hrs/week for a 3-month period, likely extending to 40 hrs/week ongoing
• Open to contract or contract-to-hire
WHAT WE WILL BE DOING:
• Data Architecture and Infrastructure - Help design and implement end-to-end data platforms, including the design and execution of microservices for data ingestion, data transformation, alerting, monitoring, and other key components of a data platform ecosystem.
• Data Modeling - physical data modeling for distributed, columnar warehouses as well as data modeling for transactional back-end services
• Data Pipelines - build end-to-end ETL/ELT pipelines to populate Data Warehouses and empower downstream analytical data consumers
WHAT WE ARE LOOKING FOR:
• BS degree in Computer Science or related technical field, or equivalent practical experience
• 2+ years' experience as a Data Architect and 5+ years' proven work experience as a Data Engineer, working with at least one programming language (Python, Scala, Go) plus SQL expertise
• 5+ years' experience developing data platform infrastructure and ETL/ELT design and implementation, including development best practices in testing, logging, and monitoring
• Extensive experience implementing CI/CD practices and infrastructure as code (Terraform) as a foundation for developing data platforms from the ground up on cloud ecosystems
• Background working with distributed big data technologies such as Spark, Presto, Hive, etc. as well as data warehouse technologies including Snowflake, BigQuery, or Redshift
• Experience implementing microservice architectures, event-based processing, and streaming pipelines, including expertise with Kafka and Spark Streaming
• Extensive data modeling knowledge, including NoSQL, transactional DBs, and columnar and row-based distributed analytical data environments, as well as prior experience designing and implementing best-practice Data Lakes
• Knowledge of agile software development and continuous integration/deployment principles