Job Overview
We are a global technology consultancy driving large-scale digital transformations for the Fortune 500. As a strategic partner to Google, we help enterprise clients navigate complex data landscapes: migrating legacy systems to the cloud, optimizing costs, and turning raw data into executive-level insights.
We are seeking a Data Analyst who acts less like a technician and more like a Data Consultant. You will blend deep technical expertise in SQL and ETL with the soft skills required to tell compelling data stories to non-technical stakeholders.

Key Responsibilities

  • Strategic Cloud Data Architecture: Lead high-impact data migration projects. You will assess a client's legacy infrastructure and design the logic to move it to the cloud efficiently, with a focus on scalability and security.
  • Cost and Performance Optimization: Audit and optimize cloud data warehouses (e.g., BigQuery, Snowflake, Redshift). You will use logical reasoning to hunt down inefficiencies, optimize queries, and restructure data models to save clients significant operational costs (a short illustrative query follows this list).
  • End-to-End Data Pipelines (ETL/ELT): Build and maintain robust data pipelines. Whether it's streaming or batch processing, you will ensure data flows seamlessly from source to dashboard using modern frameworks.
  • Data Storytelling & Visualization: This is critical. You will build dashboards (Looker, Tableau, Power BI) that don't just show numbers but answer business questions. You must be able to present these findings to C-suite clients with clarity and confidence.
  • Advanced Analytics: Apply statistical rigor and logical deduction to solve unstructured business problems (e.g., "Why is our user retention dropping?").
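
To give a flavor of the optimization work described above, here is a minimal, hypothetical sketch in standard SQL against a BigQuery-style partitioned table. The table and column names (analytics.events, event_date, event_ts, user_id, event_name) are invented for illustration; the pattern of pruning columns and filtering directly on the partition column is the point.

    -- Before (expensive): selects every column, and wrapping the
    -- timestamp in a function may prevent partition pruning:
    --   SELECT * FROM analytics.events WHERE DATE(event_ts) = '2024-01-15';

    -- After: request only the needed columns and filter directly on the
    -- partition column so the engine can skip irrelevant partitions.
    SELECT
      user_id,
      event_name
    FROM analytics.events
    WHERE event_date = '2024-01-15';  -- partition column enables pruning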

Required Skills and Qualifications
1. Core Competencies
  • Logical Reasoning: You possess a deductive mindset. You can break down ambiguous client problems into solvable technical components without needing hand-holding.
  • Advanced SQL Mastery: You are fluent in complex SQL (window functions, CTEs, stored procedures) and know how to write efficient queries against massive datasets (see the short sketch after this list).
  • Communication Skills: You are an articulate storyteller who can bridge the gap between engineering jargon and business value.
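
As a hypothetical illustration of the SQL fluency above, the sketch below combines a CTE with a window function to keep each user's most recent session. The table and column names (app.sessions, user_id, session_id, started_at) are invented:

    -- Rank each user's sessions by recency inside a CTE, then keep the
    -- latest one per user.
    WITH ranked_sessions AS (
      SELECT
        user_id,
        session_id,
        started_at,
        ROW_NUMBER() OVER (
          PARTITION BY user_id
          ORDER BY started_at DESC
        ) AS recency_rank
      FROM app.sessions
    )
    SELECT user_id, session_id, started_at
    FROM ranked_sessions
    WHERE recency_rank = 1;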

2. Technical Experience
  • Cloud Proficiency: 3+ years of experience working within a major public cloud ecosystem (GCP, AWS, or Azure). Note: While we primarily use Google Cloud (BigQuery, Looker), we value strong architectural fundamentals over specific tool knowledge.
  • Data Engineering & ETL: Experience with big data processing tools (e.g., Spark, Apache Beam, Databricks, or cloud-native equivalents).
  • Visualization: Proven mastery of at least one enterprise BI tool (Looker, Tableau, Power BI, Qlik).