Publicis Sapient is seeking a skilled Cloud Architect to join our team. In this role, you will take a hands-on part in driving the architecture, design, and implementation of Snowflake solutions for our clients. You will also apply your cloud expertise to business development and pitch activities.
- Play a key role in delivering data-driven interactive experiences to our clients
- Forge close relationships with clients to understand their needs and effectively translate them into technological solutions
- Serve as a technical expert, translating complex business problems into data integration, processing, analytics, and storage designs
- Employ problem-solving skills to resolve issues and eliminate obstacles throughout the lifecycle of client engagements
- Design and implement scalable, streamlined data pipelines that address key business use cases and meet data SLA requirements
- Ensure all deliverables are high quality by setting development standards, adhering to them, and participating in code reviews
- Participate in integrated validation and analysis sessions of components and subsystems on production servers
- Mentor, support and manage team members
- Expertise in Snowflake concepts such as resource monitors, RBAC controls, virtual warehouses, query performance tuning, zero-copy cloning, and Time Travel, and a solid understanding of when to apply these features
- Expertise in Snowflake data modeling, ELT using Snowpipe, stored procedure implementation, and standard DWH and ETL concepts
- Proven track record of successfully building data-driven solutions using integration, big-data processing, database, and storage technologies
- Experience establishing data quality processes, performing data analysis, participating in technology implementation planning, and implementing data integration processes
- Experience architecting data pipelines and solutions for both streaming and batch integrations using tools/frameworks such as AWS Glue, AWS Lambda, Google Cloud Dataflow, Azure Data Factory, Spark, and Spark Streaming
- Experience migrating data to the Snowflake cloud data warehouse
- Experience developing dbt models to transform data into useful, actionable information
- Expertise in designing ETL frameworks, with several years of hands-on experience implementing all aspects of data pipelines, including ingestion, transformation, data quality checks, monitoring, alerting, and notifications
- Experience with enterprise cloud economics
- Understanding of enterprise data management concepts (Data Governance, Data Engineering, Data Science, Data Lake, Data Warehouse, Data Sharing, Data Applications)
- Hands-on expertise with SQL and SQL analytics
- Industry benchmarking experience in major industries such as Financial Services, Retail, Travel, and Health
- Must-Have Skills: dbt, Snowflake, SQL, Python
- Nice-to-Have Skills: Airflow, AWS data analytics services, Scala, PySpark, Great Expectations
Compensation Range: $110k-$160k