USACares Jobs

Job Information

PSG Global Solutions Data Engineering Manager in Purchase, New York

The Opportunity

We're looking for a Data Engineering Manager, working in the Consumer Packaged Goods industry in Purchase, New York.

  • Maintains a predictable, transparent, global operating rhythm that ensures always-on access to high-quality data for stakeholders across the company.

  • Works on day-to-day data collection, transportation, maintenance/curation and access to the corporate data asset.

  • Works cross-functionally across the enterprise to centralize data and standardize it for use by business, data science, and other stakeholders.

  • Increases awareness about available data and democratizes access to it across the company.

  • Oversees the quality of data pipelines into and within the company, working with the data architecture and data quality teams to structure high-quality data-at-rest and provision data for use and experimentation by various internal data customers.

  • Works in a hybrid environment with in-house, on-premises data sources as well as cloud and remote systems.

  • Works closely with the data science team to enable exploration of new data sources and the development of models to classify and bring meaning to unknown and unstructured data.

  • Supervises data ingestion and integration processes from data source systems into the enterprise data warehouse, data lake, or other data storage and exploration tools (e.g., Databricks Delta Lake).

  • Professionalizes data engineering by developing standard management processes to oversee the lifecycle of ingress and egress data pipelines and customer data sandboxes.

  • Partners with IT, data architecture, and other teams on the administration and monitoring of all data platforms to ensure data is properly transported, harmonized and made accessible across key dimensions.

  • Works on performance tuning and optimization of all data ingestion and data integration processes, including data platform and databases.

  • Works cross-functionally with various teams and stakeholders to resolve data issues.

Our Client

Our client provides a variety of staffing services to the insurance and other industries. Demand is the strongest it has ever been. And they know what drives it.

Are your skills the strongest they have ever been? Let us connect you with this 40-year-old firm that gets your value. Small enough to care about you. Big enough to have long-standing relationships with companies that need your expertise. Positioned to put you to work.

Wherever you are in your career, they are there to meet you. Wherever you want to go, let us help you figure out how to get there. And champion you along the way.

Experience Required for Your Success

  • Bachelor’s degree required

  • 6+ years of work experience

  • 6+ years of experience using SQL

  • Comfortable setting up and overseeing batch and API-based data pipelines

  • Experience using data tools like MFT (Managed File Transfer – TIBCO, IBM MQ FTE), ETL (Extract, Transform and Load – Informatica), DataStage, Alteryx, other industrial data pipeline tools, etc.

  • Experience managing pipelines into a heterogeneous data architecture/technology ecosystem

  • Experience with tools like Azure Data Lake (Analytics and Storage), Data Warehouse, Synapse Analytics, Amazon Redshift, Data Factory, and Logic Apps

  • Familiarity with Hadoop-based technologies (HDInsight, Spark, Hive, Pig, etc.) preferred

  • Knowledge of a complementary scripting language (Python, R, Scala) preferred


The pay range we are offering is $140,000 to $160,000 per year.

What Do You Think?

Does your experience reflect what it takes to be successful in this role? Do the work and challenges get you excited about what's possible? Apply here.

Not exactly? Join Our Talent Community, and we'll let you know of additional opportunities.

EOE Protected Veterans/Disability