Databricks Architect - Data Engineering
- Full-time
Company Description
Publicis Sapient is a digital transformation partner helping established organizations get to their future, digitally-enabled state, both in the way they work and the way they serve their customers. We help unlock value through a start-up mindset and modern methods, fusing strategy, consulting and customer experience with agile engineering and problem-solving creativity. United by our core values and our purpose of helping people thrive in the brave pursuit of next, our 20,000+ people in 53 offices around the world combine experience across technology, data sciences, consulting and customer obsession to accelerate our clients’ businesses through designing the products and services their customers truly value.
Job Description
Publicis Sapient is looking for a Data Architect to join our team of bright thinkers and doers. You’ll use your problem-solving creativity to design, architect, and develop high-end technology solutions that solve our clients’ most complex and challenging problems across different industries. We are on a mission to transform the world, and you will be instrumental in shaping how we do it with your ideas, thoughts, and solutions.
Your Daily Duties and Impact:
- Work closely with our clients providing evaluation and recommendations of design patterns and solutions for data platforms with a focus on ETL, ELT, ALT, lambda, and kappa architectures
- Define SLAs, SLIs, and SLOs with inputs from clients, product owners, and engineers to deliver data-driven interactive experiences
- Provide expertise, proof-of-concept, prototype, and reference implementations of architectural solutions for cloud, on-prem, hybrid, and edge-based data platforms
- Provide technical inputs to agile processes, such as epic, story, and task definition to resolve issues and remove barriers throughout the lifecycle of client engagements
- Create and maintain infrastructure-as-code for cloud, on-prem, and hybrid environments using tools such as Terraform, CloudFormation, Azure Resource Manager, Helm, and Google Cloud Deployment Manager
- Mentor, support, and manage team members
Qualifications
- Demonstrable experience with enterprise-level data platforms, including implementation of end-to-end data pipelines
- Hands-on experience using Databricks
- Hands-on experience with at least one of the leading public cloud data platforms (Amazon Web Services, Azure or Google Cloud)
- Experience with column-oriented database technologies (e.g., BigQuery, Redshift, Vertica), NoSQL database technologies (e.g., DynamoDB, Bigtable, Cosmos DB), and traditional database systems (e.g., SQL Server, Oracle, MySQL)
- Experience in architecting data pipelines and solutions for both streaming and batch integrations using tools/frameworks such as AWS Glue, AWS Lambda, Google Cloud Dataflow, Azure Data Factory, Spark, and Spark Streaming
- Metadata definition and management via data catalogs, service catalogs, and stewardship tools such as OpenMetadata, DataHub, Alation, AWS Glue Data Catalog, and Google Cloud Data Catalog
- Test plan creation and test programming using automated testing frameworks, data validation and quality frameworks, and data lineage frameworks
- Data modeling, querying, and optimization for relational, NoSQL, timeseries, graph databases, data warehouses and data lakes
- Data processing programming using SQL, dbt, Python, and similar tools
- Logical programming in Python, Spark, PySpark, Java, JavaScript, and/or Scala
- Cloud-native data platform design with a focus on streaming and event-driven architectures
- Participation in integrated validation and analysis sessions of components and subsystems on production servers
- Data ingest, validation, and enrichment pipeline design and implementation
- SDLC optimization across workstreams within a solution
- Bachelor’s degree in Computer Science, Engineering, or related field
Your Skills & Experience:
- Exceptional data engineering skills with a distributed computing background and proven experience delivering large-scale data platforms
- Good grasp of analytics, measurement, reporting, and business intelligence, including modeling, insights generation, and data science
- Hands-on technologist with deep expertise in the big data ecosystem for data integration, data storage, compute frameworks, analytics, and advanced visualization (e.g., ETL tools, streaming tools, NoSQL databases, Spark, Airflow, ELT tools such as dbt, reporting tools, AI/ML platforms)
- Exposure to AWS EMR, Glue, Athena, S3, and SQS/SNS, or equivalent technologies in other cloud platforms
- Expertise in data governance for the big data space, with knowledge of various MDM/entity resolution solutions
- Experience in Financial Services Data Engineering
- Experience in building and applying best practices for performance, security, and cost-efficiency in data lakes
- Ability to lead teams that rapidly learn the client’s current digital ecosystem and produce a future data landscape vision and strategy considering the transformation agenda and business goals
- A point of view on and understanding of build vs. buy, performance considerations, hosting, commercial models, business intelligence, reporting, and analytics
- Excellent client communication and facilitation skills, the ability to influence others and gain consensus, and team collaboration
- Combination of proficient consulting, business, strategy, technical and people skills
Set Yourself Apart With:
- Certifications for any of the major cloud platforms (AWS, Azure, or GCP)
- Certifications in machine learning or advanced analytics
- Experience working with code repositories and continuous integration pipelines using AWS CodeBuild/CodePipeline or similar tools/technologies
- Experience in data governance and lineage implementation
- Multi-geo and distributed delivery experience in large programs
Additional Information
Pay Range: $117,000-$210,000
The range shown represents a grouping of relevant ranges currently in use at Publicis Sapient. Actual range for this position may differ, depending on location and specific skillset required for the work itself.
Benefits of Working Here:
- Flexible vacation policy; time is not limited, allocated, or accrued
- 16 paid holidays throughout the year
- Generous parental leave and new parent transition program
- Tuition reimbursement
- Corporate gift matching program
As part of our dedication to an inclusive and diverse workforce, Publicis Sapient is committed to Equal Employment Opportunity without regard for race, color, national origin, ethnicity, gender, protected veteran status, disability, sexual orientation, gender identity, or religion. We are also committed to providing reasonable accommodations for qualified individuals with disabilities and disabled veterans in our job application procedures. If you need assistance or an accommodation due to a disability, you may contact us at [email protected] or you may call us at +1-617-621-0200.