Cloud Data Architect

Posted 2 months ago


Time commitment: Full time
Company size: Not disclosed


AWS Solutions Architecture
Amazon Redshift

Job description

A U.S.-based company that operates a chain of musical instrument stores for musicians and music enthusiasts is looking for a Cloud Data Architect. The selected candidate will take a strong leadership role in developing the enterprise data architecture for the company's digital and retail businesses across its brands. The company has well over 500 retail locations across the U.S. that sell new and used musical instruments and offer instrument rentals, repairs, and lessons to musicians of all ages and abilities. The business has successfully raised over $30 million through debt financing. The role requires a 7-8-hour overlap with IST and some overlap with the PST time zone.
Job Responsibilities:
  • Assess and create prototypes of new concepts and technologies
  • Develop essential capabilities that can be handed off to operational development teams
  • Analyze complex problems and actively participate in enterprise technology design decisions
  • Take ownership of modernizing, migrating, and transforming the cloud data platform
  • Safeguard data by designing and building reliable and scalable data infrastructure with leading privacy and security techniques
  • Architect solutions that are scalable, secure, low latency, resilient and cost-effective to enable predictive and prescriptive analytics across the organization
  • Design and develop frameworks that can effectively operationalize ML models through serverless architecture while supporting unsupervised continuous training models
  • Take ownership of scaling the company's data models and the platforms that consume them, including Tableau, DynamoDB, and Kibana
  • Analyze data to share data-backed findings with technical and non-technical internal and external stakeholders
  • Use data modeling best practices to build frameworks for data ingestion pipelines (both real-time and batch) and ETL/ELT processes, and hand them off effectively to data engineers
  • Collaborate with peers and actively participate in critical technical decisions
  • Review code and implementations, and share constructive feedback
  • Utilize your experience and research to make recommendations that can drive technology direction and choices of technologies
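The batch-ingestion responsibilities above can be illustrated with a minimal sketch. This is a hypothetical example in plain Python (the field names, the `transform` rules, and the in-memory source/sink are all illustrative assumptions, not the company's actual pipeline):

```python
import csv
import io
import json

def transform(row):
    """Normalize one raw sales record (field names are hypothetical)."""
    return {
        "store_id": int(row["store_id"]),
        "sku": row["sku"].strip().upper(),
        "amount_usd": round(float(row["amount_usd"]), 2),
    }

def run_batch_etl(raw_csv, sink):
    """Extract rows from a CSV source, transform each, load as JSON lines."""
    reader = csv.DictReader(io.StringIO(raw_csv))
    for row in reader:
        sink.write(json.dumps(transform(row)) + "\n")

# Tiny end-to-end run with an in-memory source and sink
raw = "store_id,sku,amount_usd\n42, gtr-001 ,199.999\n"
out = io.StringIO()
run_batch_etl(raw, out)
print(out.getvalue().strip())
```

In a production AWS setting the same extract/transform/load shape would typically read from S3 or Kinesis and be orchestrated by Glue or Airflow, as the requirements below suggest.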
Job Requirements:
  • Bachelor’s/Master’s degree in Engineering, Computer Science (or equivalent experience)
  • 7+ years of relevant experience as a Cloud Data Architect
  • Experience working directly with enterprise data solutions (7+ years)
  • Practical exposure to a public cloud environment and on-prem infrastructure
  • Highly skilled in columnar databases such as Redshift Spectrum
  • Proficient with Time Series data stores like Apache Pinot
  • Significant understanding of AWS cloud infrastructure
  • Demonstrable experience with in-memory, serverless, streaming technologies and orchestration tools such as Kubernetes, Airflow, Spark, and Kafka
  • Hands-on experience implementing IT platforms (7+ years)
  • An AWS Big Data certification would be highly desirable
  • Previous experience implementing and designing AWS analytics and big data solutions in a large digital and retail environment would be preferred
  • Thorough understanding of and experience with online transaction processing (OLTP) and online analytical processing (OLAP) databases, data lakes, and schemas
  • Hands-on experience with AWS Cloud Data Lake Technologies and operational experience with Kinesis/Kafka, S3, Glue, and Athena
  • Familiarity with Parquet, Avro, or ORC message/file formats
  • Experience designing and developing Streaming Services, EMS, MQ, Java, XSD, File Adapters, and ESB-based applications
  • Strong understanding of distributed architectures such as Microservices, SOA, RESTful APIs, and data integration architectures
  • Experience with the big data stack: Spark, Redshift Spectrum, Flume, Kafka, Kinesis, etc.
  • Understanding of modern data streaming technologies such as Kafka and SQS/SNS queuing
  • Familiarity with Columnar databases like Redshift, Snowflake, Firebolt, etc
  • Knowledge of commonly used AWS services such as S3, Lambda, Redshift, Glue, and EC2
  • Expertise in Python, PySpark, or similar programming languages
  • Understanding of BI tools like Tableau, Domo, and MicroStrategy
  • Familiarity with CI/CD concepts
  • Excellent English communication skills
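The serverless model-operationalization pattern named in the responsibilities can also be sketched briefly. Everything here is a hypothetical illustration (the handler name follows the AWS Lambda Python convention, but the event shape and the toy linear "model" are invented for the example):

```python
import json

def score(features):
    """Toy stand-in for a deployed ML model: a fixed linear scorer."""
    weights = {"tenure_months": 0.02, "monthly_spend": 0.005}
    return sum(weights.get(k, 0.0) * v for k, v in features.items())

def lambda_handler(event, context=None):
    """AWS-Lambda-style entry point: parse the event body, score, return JSON."""
    features = json.loads(event["body"])
    return {
        "statusCode": 200,
        "body": json.dumps({"score": round(score(features), 4)}),
    }

# Local invocation with a sample API-Gateway-style event
resp = lambda_handler({"body": json.dumps({"tenure_months": 24, "monthly_spend": 150.0})})
print(resp["body"])
```

The appeal of this shape for the role described above is that the model artifact and the serving code scale independently and are billed per invocation, with no cluster to operate.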
