docs: add GodelEDGE onboard satellite AI inference product profile #148

Open
vikashkodati wants to merge 3 commits into elisa-tech:main from vikashkodati:add-godeledge-product-profile
Conversation

@vikashkodati

Summary

Adds a product profile for GodelEDGE, an edge AI inference system for onboard satellite imagery processing, as a separate document, per maintainer feedback on PR #141.

  • New file: docs/godeledge-product-profile.md
  • First space/LEO payload profile contributed to the working group
  • Covers all template fields at platform level
  • Includes Safety/Software Standards discussion

This profile was requested by @matthew-l-weber in response to an introduction on the SGL SIG mailing list (message #151).

Context

GodelEDGE runs computer vision models on COTS GPU hardware (NVIDIA Jetson Orin) in orbit, downlinking compact alert payloads instead of raw imagery. The profile describes what this class of AI/ML payload workload needs from the OS and platform layer, in order to help inform SGL requirements.
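To make the workload pattern concrete, here is a minimal sketch of the "infer onboard, downlink only alerts" idea described above. Everything here is hypothetical illustration: `detect_events` stands in for the actual vision model, and the alert schema is invented for the example, not taken from GodelEDGE.

```python
import json

def detect_events(tile, threshold=200):
    """Stand-in 'detector': return (row, col, value) for bright pixels.

    A real onboard system would run a GPU inference model here; this
    threshold check only illustrates the data-flow pattern.
    """
    return [
        (r, c, v)
        for r, row in enumerate(tile)
        for c, v in enumerate(row)
        if v >= threshold
    ]

def build_alert(tile_id, detections):
    """Pack detections into a compact JSON payload for downlink.

    The point: the downlinked alert is a few hundred bytes, versus the
    full raw image tile that never has to leave the spacecraft.
    """
    return json.dumps(
        {
            "tile": tile_id,
            "count": len(detections),
            "events": [
                {"row": r, "col": c, "value": v} for r, c, v in detections
            ],
        },
        separators=(",", ":"),
    )

if __name__ == "__main__":
    # A tiny 4x4 "image tile" with two bright pixels.
    tile = [
        [10, 12, 11, 9],
        [13, 250, 14, 10],
        [11, 12, 255, 13],
        [9, 10, 12, 11],
    ]
    print(build_alert("tile-0001", detect_events(tile)))
```

The OS/platform implications follow from this shape of workload: sustained GPU inference on ingest data, with only a small, structured payload crossing the comms boundary.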

Godel Space website: https://godel.space
Founder: Vikash Kodati (vikash@godel.space)

Per Matthew Weber's suggestion on PR elisa-tech#141, the GodelEDGE product
profile was moved to its own file (docs/godeledge-product-profile.md) instead of
being appended to the shared product-profiles.md. product-profiles.md has been
reverted to its upstream state.

Vikash Kodati, Founder of Godel Space
Removed specific per-tier power budgets, inference latency targets,
model weight sizes, GPU memory budgets, and output payload sizes.
Information is kept at the level needed for OS/platform requirements
without revealing competitive performance details.

