Decompose the UD Tools Server [Parent] #2260

@alexrichey

Description


TODO

  • Investigate Carto for BIN/BBL querying in the C# app (or via DO Functions)
  • Investigate what is actually being queried in the existing app (is it just PLUTO?)
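For the Carto investigation: CARTO exposes a SQL-over-HTTP API, so the BIN/BBL lookup may be as simple as building a query URL. A minimal Python sketch, where the account name, table name (`mappluto`), and column name (`bbl`) are all assumptions to confirm during the investigation:

```python
import urllib.parse

def build_bbl_query_url(carto_user, api_key, bbl):
    """Build a CARTO SQL API (v2) query URL for a lot by BBL.
    The table/column names (mappluto, bbl) are assumptions to verify."""
    sql = f"SELECT * FROM mappluto WHERE bbl = {int(bbl)}"  # int() guards against SQL injection
    return (f"https://{carto_user}.carto.com/api/v2/sql?"
            + urllib.parse.urlencode({"q": sql, "api_key": api_key}))
```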

Overall work to do

C# Rhino Plugin

  • C# plugin to query Carto for BIN/BBL

  • Reconfigure the endpoint so Rhino calls a DO Function. The job request should go to the DO Function, which should return a Spaces blob storage location for the eventual file.

  • C# plugin should poll the blob storage until the job is complete (maybe it should check for the existence of a `.finished` file).

  • C# plugin downloads the file when finished.

  • Add Rhino config for the serverless fn token
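The poll-then-download loop would live in the C# plugin; here is a sketch in Python for brevity. The `jobs/{job_id}/` key layout, the `.finished` marker name, and the output filename are all assumptions, not decided yet:

```python
import time

def wait_for_model(object_exists, job_id, poll_seconds=10, timeout_seconds=1800):
    """Poll blob storage until the job's .finished marker appears, then
    return the key of the generated model to download.

    object_exists: callable(key) -> bool, e.g. a HEAD request against Spaces.
    """
    marker = f"jobs/{job_id}/.finished"   # hypothetical key layout
    deadline = time.monotonic() + timeout_seconds
    while time.monotonic() < deadline:
        if object_exists(marker):
            return f"jobs/{job_id}/model.3dm"  # hypothetical output key
        time.sleep(poll_seconds)
    raise TimeoutError(f"job {job_id} not finished after {timeout_seconds}s")
```

A timeout on the plugin side keeps a stuck job from hanging Rhino indefinitely.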

DO / Batch Job

DO App batch jobs have a 30-minute timeout, which isn't long enough. Instead, let's trigger a GHA job until we have Airflow.

  • Let's make a lightweight Python DO Function to trigger GHA. It should also post the job details to the DO bucket.
  • Add a GHA workflow that takes a job_id parameter (with which it can download the job details from DO), runs model generation, and uploads the file to DO. We'll just need to extract the model-generation code from the Flask app.
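The DO Function → GHA handoff above could be sketched as follows, using GitHub's `workflow_dispatch` REST endpoint. DO's Python runtime calls `main(args)`; the workflow filename (`generate-model.yml`) and input names are assumptions, and posting job details to the bucket is omitted:

```python
import json
import urllib.request

def build_dispatch_request(repo, workflow, job_id, token, ref="main"):
    """Build the workflow_dispatch request for the model-generation GHA."""
    url = f"https://api.github.com/repos/{repo}/actions/workflows/{workflow}/dispatches"
    payload = json.dumps({"ref": ref, "inputs": {"job_id": job_id}}).encode()
    return urllib.request.Request(
        url, data=payload, method="POST",
        headers={"Authorization": f"Bearer {token}",
                 "Accept": "application/vnd.github+json"})

def main(args):
    """DO Function handler (DO's Python runtime invokes main(args))."""
    req = build_dispatch_request(
        repo=args["repo"],
        workflow="generate-model.yml",  # workflow filename is an assumption
        job_id=args["job_id"],
        token=args["token"])
    urllib.request.urlopen(req)  # GitHub answers 204 No Content on success
    return {"body": {"job_id": args["job_id"], "status": "dispatched"}}
```

Keeping the PAT in the function's environment (rather than passing it from Rhino) would pair well with the serverless-fn-token item above.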

MISC

  • ud-tools-core: don't push images to ACR. Push to Docker Hub instead.
  • new DO bucket. Read only creds for users.
  • Luckily, I haven't yet deleted the UD Tools data from our DO Postgres, so we can potentially switch back to using it.
