Data Cloud Platform Architect (AWS)

Porto (Hybrid)

Timestamp Group aggregates several leading Portuguese IT solutions and services companies around the concepts of excellence and knowledge sharing. We are committed to technological leadership, based on the quality of our service and technological solutions, supported by continuous training and certification.


Responsibilities:

  • Demonstrate deep expertise in AWS data services, including Lambda, Glue, Step Functions, and Redshift, with exposure to Azure data services. 
  • Lead the implementation and enforcement of data governance and security standards across cloud platforms. 
  • Oversee the implementation of Infrastructure as Code (IaC) solutions using Terraform and CloudFormation. 
  • Lead large-scale data migration and optimisation projects, ensuring minimal disruption and maximum efficiency.
  • Apply hands-on coding skills in Python and SQL to prototype solutions and establish coding standards. 
  • Develop robust solution architectures with a focus on scalability, performance, security, and cost optimisation. 
  • Design efficient data models and optimise query performance for large datasets. 
  • Manage ETL processes and data integration into Redshift, DuckDB, and PostgreSQL. 
  • Set up and manage logging and tracing mechanisms in AWS using services such as CloudTrail and X-Ray. 
  • Implement orchestration solutions using Apache Airflow and AWS Step Functions. 
  • Utilise Athena for interactive query analysis of large datasets in Amazon S3. 
  • Provide technical leadership and act as a subject matter expert in cloud data engineering. 
  • Write comprehensive solution and technical documentation. 
  • Stay updated on emerging technologies and industry trends. 
  • Challenge business requirements and propose innovative solutions for efficiency and performance improvement.

Main skills:
  • Deep expertise in AWS data services, with exposure to Azure data services.
  • Extensive experience with Infrastructure as Code (IaC) using Terraform and CloudFormation.
  • Proven ability to define and enforce data governance and security standards.
  • Demonstrated experience leading large-scale data migration and optimisation projects.
  • Strong programming skills in Python and SQL, with experience in prototyping and setting coding standards.
  • Experience with Iceberg tables and managing large datasets efficiently.
  • Proficiency in designing scalable and efficient data solutions on AWS, following best practices for cloud architecture and infrastructure.
  • Experience with orchestration tools such as Apache Airflow and AWS Step Functions.
  • Knowledge of ETL tools and experience working with large volumes of data, with a preference for experience with Kafka.
  • Proactive approach to production monitoring and troubleshooting.
  • Excellent communication and teamwork skills, with the ability to provide technical leadership and mentorship.
  • Strong analytical and problem-solving skills, with the ability to analyse requirements and propose innovative solutions.
  • Experience in writing solution documents and technical documentation.
  • Familiarity with Azure Databricks for data engineering and analytics tasks is an advantage.

Join us to challenge complexity with Intelligence!

We are an equal opportunity employer and welcome applications from all qualified individuals, regardless of race, ethnicity, religion, gender, sexual orientation, disability, age, or other protected status.

For more information, see our Statement on Diversity, Equality, and Inclusion (DEI).
