Posted Date: 5/19/2025
Description
Job Summary:
We are seeking a strategic and technically proficient Director of Data Architecture and Engineering to lead the design, integration, and optimization of our modern data platform. The ideal candidate will have 8+ years of experience in data engineering, data architecture, and data management, with a proven track record of delivering scalable, high-quality data solutions that support the organization's varied data needs.

Key Responsibilities:
- Lead the design and development of our modern data platform, building innovative and scalable data solutions aligned with enterprise standards and the data strategy.
- Architect and implement scalable data pipelines that integrate data from various sources, including databases, APIs, streaming data, and third-party platforms.
- Establish data integration patterns and frameworks, ensuring seamless data flow and interoperability across systems.
- Design and implement data models, data lakes, and data warehouses to support analytics and reporting.
- Collaborate with architects, engineering, product, and business leaders to align platform development with business objectives.
- Drive project execution throughout the entire software development lifecycle, mitigating risks and ensuring timely delivery.
- Ensure the smooth running of the data platform and data pipelines, and help triage and resolve data issues.
- Implement data standards, data quality frameworks, monitoring systems, and data governance best practices.
- Participate in new technology evaluations, identify alternative or new technologies, and assist in defining new enterprise standards.
- Provide leadership and mentorship to a team of data engineers and architects, fostering a culture of innovation and excellence.
- Stay informed on emerging data technologies, tools, and integration frameworks to enhance data architecture and engineering practices.

Qualifications and Skills:
- Bachelor's or master's degree in computer science, engineering, or a related field.
- 8+ years of experience in software engineering, with at least 5 years in data architecture, data engineering, or data integration roles.
- Experience in data modeling and data warehousing, with a good understanding of data models such as relational/ODS and dimensional models and related concepts.
- Strong knowledge of data architecture, data integration patterns, ETL frameworks, and data processing tools.
- Proficiency in Python, SQL, modern data platforms such as Databricks and Snowflake, and relational databases (e.g., PostgreSQL, SQL Server, Oracle).
- Proven experience building batch and real-time data pipelines for diverse data sources (structured, unstructured, semi-structured, streaming).
- Extensive experience with data pipeline tools (e.g., Azure Data Factory, Spark, Glue).
- Good understanding of big data and lakehouse table and file formats (Iceberg, Delta, Parquet, ORC).
- Strong understanding of cloud-based data storage and databases (e.g., AWS S3, Azure Data Lake, RDS, DynamoDB).
- Knowledge of building RESTful APIs using serverless technologies such as AWS Lambda or Azure Functions.
- Knowledge of data streaming technologies (e.g., CDC, Kafka, Azure Event/IoT Hub, Kinesis).
- Experience with code management and CI/CD tools across the SDLC (e.g., GitHub, GitLab, SonarQube, CodeBuild).
- Knowledge of standard IT security practices such as identity and access management, SSO, data protection, encryption, and certificate and key management.
- Knowledge of data governance, data security, and regulatory compliance.
- Excellent problem-solving and analytical skills, with the ability to design scalable data solutions.
- Excellent communication, technical writing, and presentation skills.
- Demonstrated leadership experience managing cross-functional data architecture/engineering teams.
- Demonstrated ability to adapt to new technologies and learn quickly.
- Flexibility to work a non-standard schedule, including on-call hours, as needed.

Preferred Qualifications:
- Certification in modern cloud platforms, data engineering, or data architecture.
- Proficiency in data workflow management and orchestration tools (e.g., Airflow, dbt, Dagster).

The base salary for this role can range from $120,000 to $145,000 based on a full-time work schedule. An individual's ultimate compensation will vary depending on job-related skills and experience, geographic location, alignment with market data, and equity among other team members with comparable experience.

Want to Learn More?
[Our Values] [Our Benefits] [Our Community Impact] [Our Leadership]

PURE Insurance is a property and casualty insurance company (think homes, cars, fine art, and collections) designed exclusively for financially successful individuals and families. We're dedicated to delivering an exceptional experience to our members by alleviating stress, solving challenges, and removing conflict, wherever possible, from the insurance process.

We are deeply committed to fostering a work environment where everyone has an equitable chance to learn, develop, and succeed, and where all feel welcome, safe, and supported to do their best work and bring their whole self to PURE. Our team is composed of empathetic, passionate, and curious individuals who are #PUREproud of the work we do and the milestones we achieve together. We're constantly looking for bright individuals with ambitions as high as our own to join our community and contribute to our journey.

Joining PURE means creating your own journey, too. We encourage our team members to pursue their personal passions and provide the resources and support to see them come to fruition. Learn more about our culture on LinkedIn.

Interested in PURE but don't see a role that fits? Introduce yourself and we'll get in touch if an opportunity opens up that seems like a good match.
Salary: $120,000.00 - $145,000.00 Annual