# Backend
The backend stack consists of three primary services (`api`, `kg`, `mcp`) and two libraries (`rabbit`, `scholar`); the services communicate asynchronously over RabbitMQ:

- `api`: FastAPI endpoints that serve frontend queries and integrate with backend services.
- `mcp`: FastMCP service that provides MCP access to the API and context generation for external LLM hosts.
- `kg`: Primary service for creating embeddings, interacting with LLMs, and building visualizations.
- `rabbit`: Defines the messaging layer between services.
- `scholar`: Semantic Scholar integration.
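The `rabbit` library's actual message schema isn't shown here, but as an illustration of how services might exchange work asynchronously over RabbitMQ, the sketch below round-trips a JSON envelope. The `Message` dataclass, its field names, and the `embed.request` event are hypothetical, not the real `rabbit` API:

```python
import json
from dataclasses import dataclass, asdict

# Hypothetical envelope for inter-service messages; the real `rabbit`
# library defines its own schema and routing conventions.
@dataclass
class Message:
    source: str    # originating service, e.g. "api"
    event: str     # event name, e.g. "embed.request"
    payload: dict  # event-specific body

def encode(msg: Message) -> bytes:
    """Serialize a message for publishing to a RabbitMQ exchange."""
    return json.dumps(asdict(msg)).encode("utf-8")

def decode(raw: bytes) -> Message:
    """Deserialize a message consumed from a queue."""
    return Message(**json.loads(raw.decode("utf-8")))

# Example round trip: api asks kg to embed a document.
msg = Message(source="api", event="embed.request", payload={"doc_id": "42"})
assert decode(encode(msg)) == msg
```

In a real deployment the encoded bytes would be published to an exchange with `pika` or `aio-pika` and consumed by the target service; the round trip above only demonstrates the serialization contract.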
## Local Development
Development for the backend project can be done inside a devcontainer. The container will be automatically provisioned with a Neo4j database, a RabbitMQ instance, and the official Ollama container image.
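As a sketch of what the devcontainer provisions (the actual service names, images, and ports in Nexarag's `.devcontainer` configuration may differ), a minimal Docker Compose definition for the three dependencies might look like:

```yaml
# Hypothetical sketch -- Nexarag's real .devcontainer config may differ.
services:
  neo4j:
    image: neo4j:5
    ports:
      - "7474:7474"   # HTTP browser
      - "7687:7687"   # Bolt protocol
  rabbitmq:
    image: rabbitmq:3-management
    ports:
      - "5672:5672"   # AMQP
      - "15672:15672" # management UI
  ollama:
    image: ollama/ollama
    ports:
      - "11434:11434" # Ollama HTTP API
```

The ports shown are the defaults for each service; a devcontainer typically reaches them by service name rather than `localhost`.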
- Copy the `.env.example` file into `Nexarag/.devcontainer/.env`, modifying the values for your system
- Open `Nexarag` in VS Code
- Press `Ctrl+Shift+P` and type `Open Folder in Container`
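The variables in `.env.example` aren't reproduced here; as a purely hypothetical illustration, the copied `.env` might hold connection settings along these lines (check `.env.example` for the real variable names and defaults):

```shell
# Hypothetical variable names -- consult .env.example for the real ones.
NEO4J_URI=bolt://localhost:7687
NEO4J_USER=neo4j
NEO4J_PASSWORD=changeme
RABBITMQ_HOST=localhost
OLLAMA_BASE_URL=http://localhost:11434
```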
Jupyter Lab is started automatically inside `nexarag.dev`. Use the Python 3.11.11 kernel to run Jupyter notebooks.