Who We Are

We're building the future of real estate, today. HomeLight is the essential technology platform used by hundreds of thousands of homebuyers and sellers to partner with top real estate agents and win at any step of the real estate journey, whether that's finding a top agent, securing a competitive mortgage, or ensuring an on-time, easy close. HomeLight facilitates billions of dollars of real estate transactions on its platform every year. Our vision is a world where every real estate transaction is simple, certain, and satisfying for all. Our team breaks barriers every day while staying committed to HomeLight's goals and core values, a crucial element of our shared success.

Who You Are

We are building our Data Engineering team to tackle HomeLight's diverse data challenges. This position is an excellent opportunity for an engineer who wants to own the development, optimization, and operation of our data pipeline, which collects, processes, and distributes data to a suite of HomeLight products and teams. You will provide mission-critical data to both our algorithms and internal users, refining our product and identifying new markets.

What You'll Do Here

Some projects you will work on:
- Optimize and execute on requests to pull, analyze, interpret, and visualize data.
- Partner with team leaders across the organization to build out and iterate on team and individual performance metrics.
- Optimize our data release processes, and partner with team leads to iterate on and improve existing data pipelines.
- Design and develop systems that ingest and transform our data streams using the latest tools.
- Design, build, and integrate new cutting-edge databases and data warehouses, develop new data schemas, and find innovative ways of storing and representing our data.
- Research, architect, build, and test robust, highly available, and massively scalable systems, software, and services.

What You Bring

- 5+ years of Python and ETL experience, preferably with Airflow
- Experience writing and executing complex SQL queries
- Experience building data pipelines and designing ETL (implementation and maintenance)
- Experience with Scrum/Agile software development processes

Bonus points for:
- Familiarity with ChatGPT and transcription analysis
- Familiarity with Elasticsearch and Django
- Experience setting up and managing internal API services
- Experience working on a small team, ideally at a startup
- Familiarity with the Amazon AWS ecosystem

**This is a permanent work-from-home opportunity from anywhere in the LatAm region.**

Let's Chat!