Software Engineer II – Streaming Data at AT&T in Portland

Software Engineer II – Streaming Data

Portland, Oregon


Xandr’s ad platform has one of the world’s largest collections of digital, film, and TV properties. We develop the technology that powers the real-time sale and purchase of digital advertising. Our platform is engineered to be one of the fastest, most reliable, and most massively scaled advertising systems in the industry.

About the team

The Deal Metrics Team develops and maintains the mission-critical reporting and analytics backbone of the Xandr Monetize and Xandr Invest platforms. As a Software Engineer II, you will create and maintain the high-throughput data pipelines that power the reporting and analytics customers use to make well-informed decisions about their spend on our platforms. These pipelines consume data streams from our Real-Time Auction Platform at incredible scale and transform them into actionable reports, which are used to troubleshoot issues, analyze and optimize performance, and help customers maximize monetization and spend.

About the role
This role offers the opportunity to own and build the streaming data pipelines that give our customers insights into the performance of their advertising business.

As a Software Engineer II, you will:

  • Work with high-throughput data streams (gigabytes per second) produced by our real-time auction platform
  • Develop data pipelines that refine and extract data into actionable reports used by customers
  • Implement strategies that ensure the quality, performance, and accuracy of reports
  • Ensure high availability of data pipelines and reports as platform requirements scale 
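To make the responsibilities above concrete, here is a minimal illustrative sketch in Java, the team's primary language. The `AuctionEvent` record and its field names are invented for illustration and are not Xandr's actual schema; the example shows the kind of rollup such a pipeline performs, reducing raw auction events to a per-advertiser spend report.

```java
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

// Hypothetical event shape; field names are illustrative only.
record AuctionEvent(String advertiserId, long spendCents, long impressions) {}

public class SpendReport {
    // Aggregates a batch of raw auction events into per-advertiser spend totals:
    // the same rollup a streaming pipeline would apply continuously, window by
    // window, to high-throughput auction traffic.
    public static Map<String, Long> totalSpendByAdvertiser(List<AuctionEvent> events) {
        return events.stream()
                .collect(Collectors.groupingBy(
                        AuctionEvent::advertiserId,
                        Collectors.summingLong(AuctionEvent::spendCents)));
    }

    public static void main(String[] args) {
        List<AuctionEvent> batch = List.of(
                new AuctionEvent("adv-1", 25, 1),
                new AuctionEvent("adv-2", 10, 1),
                new AuctionEvent("adv-1", 15, 1));
        System.out.println(totalSpendByAdvertiser(batch));
    }
}
```

In production this aggregation would run over a continuous stream (e.g. Kafka topics) rather than an in-memory list, but the core transformation is the same.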

Required Skills

  • 2+ years of experience as a software developer
  • Experience building software in Java or other JVM-based languages
  • 2+ years of demonstrated work experience developing data pipelines and/or working with streaming technologies such as RabbitMQ or Kafka
  • 2+ years of experience working with analytical data sets on the order of 10 TB, the OLAP paradigm, and OLAP databases such as Vertica or Snowflake
  • Strong experience with and knowledge of object-oriented programming, primarily in Java or other back-end programming languages
  • Experience working within a Kubernetes container environment
  • Strong SQL skills, with the ability to write complex aggregation queries
  • Strong experience writing well-tested code, deploying code safely, and working on a team with coding standards, including unit testing, functional testing of applications, build systems such as Jenkins, code review, and design review

Nice to Have Skills

  • Experience with Samza
  • Familiarity with Protobuf
  • Experience within Ad-Tech space

Job ID: 2045779X-1 | Date posted: 10/30/2020


