Job Url: https://jobs.lever.co/regrow.ag/c4c7961c-2fac-4746-8d0d-857b5fdac0a1

Job Description:

- 5+ years of experience building and deploying high-quality, customer-facing production data pipelines and services, primarily in Python.
- Experience working with raster and vector data structures, and proficiency in geospatial libraries/tools such as GDAL, Rasterio, Shapely, and QGIS.
- Skilled at manipulating multi-dimensional raster data and large-scale geospatial arrays (e.g., using NumPy, xarray, or Dask).
- Exposure to geospatial data standards, including OGC-compliant services (WMS, WFS, WCS) and SpatioTemporal Asset Catalog (STAC) specifications for organizing and querying data.
- Proven experience creating and deploying data processing workflows (Airflow, Argo, Kafka, etc.) that enable scalable, reproducible, and version-controlled geospatial data generation.
- Working knowledge of REST APIs built with Python web frameworks such as FastAPI, Flask, or Django.
- Strong understanding of relational databases (PostgreSQL with PostGIS, MySQL) and data warehouses like BigQuery, with proficiency in querying geospatial data and working with indexes, geometry types, and geospatial joins.
- Experience deploying apps/services on at least one major enterprise cloud platform (AWS, GCP, or Azure).
- Excellent English-language presentation and communication skills.
- Well-versed in Kubernetes and Docker (building and scheduling containers).
- Proficient in profiling, debugging, tracing, and/or parallelizing and optimizing Python code.
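As a minimal illustration of the raster-manipulation skills listed above, the sketch below computes NDVI over a multi-band array with NumPy. The band ordering and the synthetic data are hypothetical; in practice the array would come from a tool like Rasterio or xarray rather than a random generator.

```python
import numpy as np

# Hypothetical 4-band raster stack with shape (bands, rows, cols);
# assumed band order: blue, green, red, NIR.
raster = np.random.default_rng(0).random((4, 100, 100)).astype("float32")

red = raster[2]
nir = raster[3]

# NDVI = (NIR - red) / (NIR + red); guard against division by zero
# where both bands are zero (e.g., nodata pixels).
denom = nir + red
ndvi = np.where(denom > 0, (nir - red) / denom, 0.0)

print(ndvi.shape)  # per-pixel NDVI grid, same spatial shape as the input bands
```

The same per-pixel arithmetic scales to larger-than-memory rasters by swapping the NumPy array for a chunked xarray/Dask array, which is one common way the "large-scale geospatial arrays" requirement is met.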