
Senior Azure Data & Platform Engineer

Date:  Apr 28, 2026
Location: Remote, OH, US

Company Overview

 

Grounded by a history that is deeply rooted in innovation, Hexion is a global employer committed to building and protecting the future by producing innovative performance materials. Our materials are the building blocks for critical industries, including construction, agriculture, energy, automotive, and infrastructure protection. Everywhere you look, you will find our materials and people at work helping customers make products that are stronger, safer, and cleaner. When you work for Hexion, you'll join a team committed to operating safely and with integrity to build a more sustainable future for all: our associates, our customers, and the communities where we live and work.

Position Overview

 

Hexion is seeking a senior, hands-on Azure data engineer to lead the design, implementation, and production support of enterprise data integrations. The role centers on Azure Data Factory, Azure Logic Apps, and Databricks, along with adjacent Azure data services, to deliver governed, reliable, and maintainable pipelines. You will work closely with architecture, security, and infrastructure teams to apply the right guardrails in support of Hexion's business objectives.

Job Responsibilities

 

Azure Data Engineering & Integration 

  • Lead the design, development, and implementation of Azure Data Factory pipelines, Azure Logic Apps workflows, and related Azure data solutions to meet business requirements. 
  • Collaborate with business, architecture, and technical teams to translate data requirements into Azure integration deliverables. 
  • Develop and enforce best practices for data pipeline development, ETL processes, data quality, governance, and documentation. 
  • Optimize, monitor, and troubleshoot existing pipelines to improve performance, reliability, and maintainability. 
  • Support integrations across Azure services such as Azure SQL Database, Azure Data Lake Storage, Azure Databricks, and Power BI. 

 

Data Platform Enablement (Azure) 

  • Coordinate with platform teams to onboard data workloads into approved Azure environments (subscriptions/resource groups), ensuring required standards (naming, tagging, logging) are met. 
  • Define repeatable environment and deployment patterns for data products (dev/test/prod), including configuration, secrets, and release boundaries for internal teams and vendors. 
  • Ensure prerequisite platform capabilities are available for data delivery (e.g., access to ADLS, Azure SQL, Databricks workspaces), partnering with owners to provision when needed. 
  • Apply governance guardrails for data workloads (policies/controls, logging, and cost visibility) and validate that deployments remain compliant over time. 
  • Monitor and optimize the cost/performance of data services (e.g., ADF, Databricks, storage) using tagging/chargeback practices, budget alerts, and right-sizing recommendations. 

 

Secure Data Access, Identity & Governance 

  • Define least-privilege access patterns for data services (ADF, ADLS, Azure SQL, Databricks) using Entra ID, managed identities, service principals, and RBAC, then work with administering teams to implement required changes. 
  • Implement secure secret handling (Key Vault), encryption, and credential rotation approaches for pipelines and integrations. 
  • Partner with network teams to meet private connectivity requirements (private endpoints, routing, firewall rules) for data sources and targets. 
  • Ensure data integrations are production-ready and auditable (logging, lineage/documentation where applicable) and aligned to enterprise security and governance requirements. 

 

Infrastructure Automation & Operations 

  • Implement Infrastructure-as-Code using Terraform and/or Bicep. 
  • Create and support CI/CD integration for both infrastructure and application or data deployments. 
  • Set up monitoring, logging, alerting, and basic break-fix support using Azure-native tools. 
  • Support disaster recovery planning and operational readiness for Azure resources and data services. 

 

Vendor Enablement & Delivery Leadership 

  • Support vendor onboarding into Azure environments, including access, permissions, deployment boundaries, and operational guardrails. 
  • Ensure external and internal teams can deploy and operate safely without impacting core enterprise workloads. 
  • Work closely with Hexion leads, architects, and vendors to unblock delivery and accelerate execution. 
  • Mentor junior team members and help drive engineering maturity, documentation quality, and continuous improvement across the environment. 

Minimum Qualifications

 
  • Bachelor's degree in Computer Science, Information Technology, or a related field, or equivalent practical experience. 
  • 7+ years of experience in data engineering, data integration, and production support (cloud and/or hybrid environments). 
  • Strong hands-on expertise with Azure data services and integration patterns (ADF, ADLS, Azure SQL, Databricks, Logic Apps). 
  • Proven experience designing and implementing Azure Data Factory pipelines and broader Azure data integration solutions. 
  • Strong understanding of ETL, data integration, data warehousing, and production support practices. 
  • Ability to partner with cloud platform, security, and networking teams to meet requirements for connectivity, identity, and controls needed by data workloads. 
  • Working knowledge of Entra ID and Azure IAM concepts (RBAC, managed identities, service principals) as they apply to securing data pipelines and services. 
  • Proficiency in SQL, Python, and PySpark. 
  • Strong troubleshooting, communication, and collaboration skills, with the ability to operate effectively in fast-moving environments. 

 

Required Technologies: 

  • Azure Data & Integration: Azure Data Factory (ADF), Logic Apps, Databricks, ADLS Gen2, Azure SQL, Power BI  
  • Languages: SQL, Python, PySpark  
  • Security/Governance: Entra ID, RBAC, Managed Identity, Key Vault  
  • DevOps/Operations (as applicable): CI/CD, Azure DevOps/GitHub Actions, monitoring/logging, Terraform/Bicep 

Preferred Qualifications

 

  • Experience with Azure SQL Database, ADLS, Databricks, and Power BI. 
  • Experience with Infrastructure-as-Code (Terraform and/or Bicep) and automating deployments for data platforms. 
  • Exposure to AI/ML, agentic AI, or automation-heavy Azure workloads. 
  • Experience supporting secure multi-vendor Azure environments. 
  • Familiarity with Azure monitoring, logging, and disaster recovery patterns. 
  • Experience implementing or operationalizing controls aligned to ISO/IEC 27018 (protection of personally identifiable information in public clouds) is a plus. 
  • Familiarity with SAP platforms (ECC R/3, S/4HANA, BW, and Datasphere), especially in the context of integrating SAP data into Azure. 
  • Azure certifications such as Azure Solutions Architect Expert. 
  • Databricks certifications such as Data Engineer Associate or Professional.
  • Databricks accreditations such as Databricks Fundamentals, Azure Platform Architect, or Platform Administrator. 
  • AWS experience is a plus. 

Success Measures

 
  • Azure data pipelines are designed and operating reliably in production. 
  • Delivery teams and vendors can release infrastructure builds safely using clear patterns for configuration, access, and deployment. 
  • Stakeholders receive timely, trustworthy data through integrations that meet agreed SLAs and operational expectations. 
  • Data quality, monitoring, and alerting are in place so issues are detected early and resolved quickly with clear ownership and runbooks. 
  • Documentation, repeatability, and delivery maturity improve over time, reducing friction between architecture, platform teams, and delivery teams. 

Other

 

We are an Equal Opportunity, Affirmative Action employer. All qualified applicants will receive consideration for employment without regard to gender, pregnancy, race, national origin, religion, age, sexual orientation, gender identity, veteran or military status, status as a qualified individual with a disability or any other characteristic protected by law.

 

To be considered for this position, candidates are required to submit an application for employment through our career site and be at least 18 years of age. Any offer of employment will be conditioned upon successful completion of a drug test and background investigation, as well as authorization for the Company to conduct additional periodic background checks as required by the Chemical Facility Anti-Terrorism Standards (CFATS) or regulations adopted by the Department of Homeland Security or other regulatory agencies. A prior criminal record is not an automatic bar to employment, and the Company will conduct an individualized assessment and reassessment, consistent with applicable law, prior to making any final employment decision.


