What are the responsibilities and job description for the Data Acquisition Engineer | News Media | Princeton, NJ position at Andiamo?
About the Opportunity
We’re hiring a Data Acquisition Engineer to join a high-impact team focused on acquiring and transforming real-time news and financial content at scale. This role is part of a core engineering function that powers mission-critical systems across a global information platform.
As a Data Acquisition Engineer, you’ll design and build systems that ingest content from a wide range of internal and external sources, enabling downstream teams to access enriched, accurate, and actionable data in real time.
This is a hybrid role based out of Princeton, NJ, requiring 3 days onsite each week. It offers the opportunity to collaborate with some of the brightest minds in data engineering, product, and machine learning.
📍 Location: Princeton, NJ (Hybrid – 3 days onsite per week)
🕒 Type: Full-Time | Direct Hire
🔍 Focus: Web Crawling · API Integrations · Real-Time Data Pipelines
What you’ll do:
- Design and maintain web crawlers and scraping systems to acquire large volumes of structured and unstructured content.
- Integrate with third-party APIs and feeds to capture breaking news and data updates in real time (see the sketch after this list).
- Build and support data enrichment pipelines that classify, tag, and normalize incoming content.
- Monitor, troubleshoot, and improve performance of acquisition systems in production.
- Work cross-functionally with product managers, data scientists, and infrastructure engineers to continuously improve system flexibility, speed, and reliability.
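For a concrete flavor of the day-to-day work, here is a minimal sketch of the kind of acquisition job this role builds. The feed URL and field names are hypothetical, and a production system would add retries, deduplication, and monitoring:

```python
import time
import requests

# Hypothetical JSON feed of breaking-news items; any real source would differ.
FEED_URL = "https://example.com/api/news/latest"
POLL_INTERVAL_SECONDS = 30  # crude rate limiting between polls


def normalize(item: dict) -> dict:
    """Map a raw feed item onto a consistent internal schema."""
    return {
        "headline": item.get("title", "").strip(),
        "body": item.get("content", ""),
        "source": item.get("source", "unknown"),
        "published_at": item.get("published_at"),
    }


def poll_once() -> list[dict]:
    """Fetch the feed once and return normalized records."""
    response = requests.get(FEED_URL, timeout=10)
    response.raise_for_status()
    return [normalize(item) for item in response.json().get("items", [])]


if __name__ == "__main__":
    while True:
        for record in poll_once():
            print(record["headline"])  # downstream: enrich, tag, and publish
        time.sleep(POLL_INTERVAL_SECONDS)
```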
Who you are:
- You’re passionate about clean data pipelines and fast-moving systems.
- You love digging into unstructured sources and turning them into usable, structured data.
- You thrive in hybrid environments and enjoy collaborating face-to-face with your team several days a week.
What you'll bring:
- 3–5 years of hands-on experience in software or data engineering roles.
- Proficiency in Python, including experience building scalable scripts, services, or crawlers.
- Strong experience with web crawling, scraping frameworks, and handling edge cases like JavaScript rendering or rate limiting.
- Experience working with RESTful APIs, data feeds, and integration of external data sources.
- Solid background in data enrichment, content classification, or pipeline optimization.
- Familiarity with Linux environments, Git, and basic scripting tools.
- Knowledge of both SQL and NoSQL data stores (a brief SQL sketch follows this list).
- Strong communication skills, especially when translating complex ideas to non-technical audiences.
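As an illustration of the SQL side of that last set of requirements, here is one way a batch of normalized articles might be persisted. The schema and table name are purely illustrative; a real deployment would more likely target Postgres or a document store:

```python
import sqlite3

# Illustrative schema; column names match the normalized records above.
SCHEMA = """
CREATE TABLE IF NOT EXISTS articles (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    headline TEXT NOT NULL,
    source TEXT,
    published_at TEXT
)
"""


def save_articles(db_path: str, articles: list[dict]) -> None:
    """Insert a batch of normalized articles, committing once per batch."""
    with sqlite3.connect(db_path) as conn:
        conn.execute(SCHEMA)
        conn.executemany(
            "INSERT INTO articles (headline, source, published_at) VALUES (?, ?, ?)",
            [(a["headline"], a.get("source"), a.get("published_at")) for a in articles],
        )


if __name__ == "__main__":
    save_articles("news.db", [{"headline": "Example headline", "source": "wire"}])
```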
Bonus Points For:
- Prior experience in news, media, or financial data domains.
- Familiarity with tools like Scrapy, Selenium, or Playwright (a short Playwright sketch follows this list).
- Understanding of content metadata, taxonomies, or machine learning integration into pipelines.
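And as a taste of the Playwright bonus, here is a small sketch for capturing a JavaScript-rendered page. The target URL is hypothetical, and the script assumes `pip install playwright` followed by `playwright install chromium`:

```python
from playwright.sync_api import sync_playwright

# Hypothetical JavaScript-heavy page that plain HTTP scraping would miss.
TARGET_URL = "https://example.com/live-coverage"


def fetch_rendered_html(url: str) -> str:
    """Load the page in headless Chromium and return the fully rendered HTML."""
    with sync_playwright() as p:
        browser = p.chromium.launch(headless=True)
        page = browser.new_page()
        page.goto(url, wait_until="networkidle")
        html = page.content()
        browser.close()
    return html


if __name__ == "__main__":
    print(len(fetch_rendered_html(TARGET_URL)), "characters of rendered HTML")
```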
This is an excellent opportunity to join a fast-paced, highly collaborative environment where your work will directly impact the delivery of real-time data to global users.
Apply now directly, or forward your resume to dt.thapar@andiamogo.com with 'Data Acquisition Engineer' in the subject line. If you’re excited to work on systems that demand speed, scale, and precision, this is your chance!
Salary: $75–$115