Financial indicators and analytics for global public companies. Live at sponda.capital.
Smooths earnings volatility by averaging 10 years of inflation-adjusted net income. More reliable than single-year P/E for identifying overvalued or undervalued companies.
PE10 = Market Cap / Average Inflation-Adjusted Annual Net Income (10 years)
Same 10-year averaging approach, but using free cash flow instead of earnings. Reveals whether reported profits translate into real cash generation.
PFCF10 = Market Cap / Average Inflation-Adjusted Annual FCF (10 years)
FCF = Operating Cash Flow + Investing Cash Flow
Point-in-time ratio measuring how much of the company's capital structure is financed by debt relative to shareholders' equity. Uses the most recent balance sheet.
Dívida Bruta / PL = Total Debt / Stockholders' Equity
Debt source (Brazilian tickers): historical debt per balance sheet comes from BRAPI's balanceSheetHistory (loansAndFinancing + longTermLoansAndFinancing). For the most recent quarter, we additionally query BRAPI's financialData.totalDebt — a broader figure that also includes debentures, financial leases, and other interest-bearing obligations. When financialData.totalDebt is larger (or when balance-sheet loans are zero because BRAPI's raw fields are incomplete, as happens on many mid/small caps and banks), we override the latest row's total_debt with it. We never downgrade. If financialData reports totalDebt=None (typical of banks, whose liabilities are deposits rather than loans) the leverage card shows "not available" instead of a misleading zero.
Broader measure that considers all obligations (not just financial debt) relative to equity. Includes suppliers, taxes, provisions, etc.
Passivo / PL = Total Liabilities / Stockholders' Equity
Dual-panel chart showing historical adjusted prices alongside year-end P/L (Price/Earnings) and P/FCL (Price/Free Cash Flow) multiples. Helps visualize how a company's valuation has evolved over time relative to its stock price.
How it works:
- Fetches monthly historical prices from BRAPI (`range=max&interval=1mo`)
- Approximates shares outstanding as `market_cap / current_price`
- For each year with earnings/FCF data, calculates the year-end multiple: `(year_end_price × shares) / net_income` (or FCF)
- Years with negative earnings or FCF show as gaps in the chart
- Top panel shows adjusted prices (monthly); bottom panel shows the selected multiple (annual)
- Toggle between P/L and P/FCL with pill buttons
API endpoint: GET /api/quote/<ticker>/multiples-history/
- Trigram indexes (pg_trgm) on `Ticker.display_name` and `symbol` for sub-millisecond ILIKE search across 27K+ tickers
- Composite indexes on `CompanyAnalysis(ticker, -generated_at)`, `LookupLog(user, timestamp)`, `LookupLog(session_key, timestamp)`
- PostgreSQL tuning for SSD + 2 GB RAM: `shared_buffers=512MB`, `work_mem=8MB`, `random_page_cost=1.1`
- pg_stat_statements enabled for query performance monitoring
Three-layer caching strategy eliminates redundant external API calls:
Layer 1 · Provider cache (in providers.py): raw external API responses (BRAPI/FMP) are cached at the routing layer, so multiple views that need the same data (e.g. fetch_quote, fetch_historical_prices) share a single external call.
| Provider call | TTL |
|---|---|
| `fetch_quote` | 15 min |
| `fetch_historical_prices` | 1 hour |
| `fetch_dividends` | 1 hour |
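Layer 1 behaves like a per-function TTL cache. A minimal sketch, using an in-process dict for illustration (the real app presumably routes through Django's cache framework):

```python
# Minimal TTL-cache sketch: one shared external call per (args) within the
# TTL window. In-process dict for illustration only.
import time
from functools import wraps

def ttl_cache(ttl_seconds: int):
    def decorator(fn):
        store: dict[tuple, tuple[float, object]] = {}
        @wraps(fn)
        def wrapper(*args):
            now = time.monotonic()
            hit = store.get(args)
            if hit and now - hit[0] < ttl_seconds:
                return hit[1]  # served from cache, no external call
            value = fn(*args)
            store[args] = (now, value)
            return value
        return wrapper
    return decorator

@ttl_cache(ttl_seconds=900)  # fetch_quote: 15 min TTL
def fetch_quote(ticker: str) -> dict:
    ...  # external BRAPI/FMP call would happen here
```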
Layer 2 · View cache: computed results for each API endpoint.
| Endpoint | TTL | What it avoids |
|---|---|---|
| Ticker list (27K rows) | 1 hour | Full table scan on every page load |
| Search results | 2 min | Trigram query + sorting per keystroke |
| PE10 metrics | 4 hours | 6+ DB queries + external API call + inflation adjustment |
| Fundamentals | 6 hours | All balance sheets, earnings, cash flows + IPCA table + external API |
| Multiples history | 6 hours | 2 sequential external API calls (was 8s uncached) |
Layer 3 · Cache warming: python manage.py warm_cache pre-populates all three endpoints for the top 50 most-queried tickers. Run every 4 hours via cron so popular tickers are always served from cache.
- Search debounce at 300ms to reduce API calls during typing
- Dynamic imports via `next/dynamic` for CompanyMetricsCard, MultiplesChart (Recharts), CompareTab, FundamentalsTab, and CompanyAnalysis. Recharts (~100KB) only loads when the Charts tab is opened.
- Prefetch on hover: hovering over the Fundamentos or Gráficos tabs triggers `queryClient.prefetchQuery()`, so data is ready before the user clicks
- Self-hosted Satoshi font: eliminates the 1.15s Fontshare external request
- 30-minute `staleTime` on React Query hooks; SSR revalidation at 1 hour
- Lazy-loaded images on all company logos; footer logo served via Next.js `<Image>` with WebP optimization
- `useMemo` on frequently recomputed derived state (`excludeSet`, `sectorPeerLinks`)
Locale-prefixed URLs serve region-specific metadata to search engines across all 7 supported locales (pt, en, es, zh, fr, de, it):
- `/pt/PETR4/fundamentos` · Portuguese metadata, `<html lang="pt-BR">`, OG locale `pt_BR`
- `/en/PETR4/fundamentals` · English metadata, `<html lang="en">`, OG locale `en_US`
- `/fr/PETR4/fondamentaux`, `/de/PETR4/fundamentaldaten`, etc. follow the same shape
- Bare URLs (`/PETR4`) 302-redirect to the locale-prefixed version based on the `sponda-lang` cookie, then `Accept-Language`
- Every page includes `<link rel="alternate" hreflang>` cross-links between all 7 locales plus `x-default` (English)
- Tab URL paths are localized per locale (see `CANONICAL_TO_LOCALE_SLUG` in `frontend/src/middleware.ts`)
OG images are static JPEGs under frontend/public/images/:
- `sponda-og.jpg` · Portuguese tagline, used for `/pt/*` URLs
- `sponda-og-en.jpg` · English tagline, used for every other locale
getOgImageUrl(locale) in frontend/src/lib/metadata.ts selects the right image; both the homepage layout and generateTickerMetadata go through it. Only PT and EN images exist today because most crawlers cache a single OG image per URL and maintaining one per locale wasn't worth the churn. If you need a new localized image, drop sponda-og-&lt;locale&gt;.jpg into public/images/ and extend the helper.
Two sitemaps are emitted; both advertise every URL in all 7 locales with full xhtml:link rel="alternate" hreflang alternates:
- `/sitemap.xml` · Next.js, generated by `frontend/src/app/sitemap.ts` (production source of truth — Nginx routes the root `/sitemap.xml` to Next.js on port 3100)
- `/api/sitemap.xml` · Django, generated by `SitemapView` in `backend/quotes/views.py` (fallback / API consumers)
Both use shared constants: canonical tab keys (charts, fundamentals, compare) map to localized slugs in SITEMAP_TAB_SLUGS (backend) and tabSlugForLocale (frontend). Keep these in sync when adding a new locale.
- N+1 fix in AdminDashboard: replaced per-user loops with `annotate(Count(...))` (1,200+ queries down to 1)
- CompanyAnalysisView: 3 queries reduced to 1 with `.values()`
The Compare tab on each company page lists up to 10 peer tickers ranked by how close they are to the source company. Ranking uses four tiers of signal, applied in order:
- Subsector within the same sector — companies whose business line maps to the same subsector as the source (e.g. VALE3 and GGBR4 both map to Mineração e Siderurgia, while KLBN4 maps to Papel e Celulose).
- Other subsectors in the same sector — fills remaining slots when subsector peers aren't enough.
- Adjacent sectors — only considered when the sector itself has too few candidates (see `ADJACENT_SECTORS` in `backend/quotes/views.py`).
- Country, then market cap — within a tier, same-country peers come first; within same-country, larger market cap comes first.
Subsector inference is pattern-based: a per-sector list of regexes in SUBSECTOR_RULES (Finance, Non-Energy Minerals, Process Industries, Retail Trade, Transportation, Utilities, etc.) matches against the company name. Unmatched companies fall back to a default subsector label per sector. No schema change — the subsector is derived at query time.
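A minimal sketch of this inference, with a toy rule table (the real patterns live in `SUBSECTOR_RULES`; the rules and names below are illustrative):

```python
# Sketch of pattern-based subsector inference; rule table is illustrative.
import re

SUBSECTOR_RULES_EXAMPLE = {
    "Non-Energy Minerals": [
        (r"vale|gerdau|sider|mine", "Mineração e Siderurgia"),
        (r"klabin|celulose|papel", "Papel e Celulose"),
    ],
}

def infer_subsector(sector: str, company_name: str,
                    default: str = "Outros") -> str:
    """First regex that matches the company name wins; otherwise a
    per-sector default subsector label."""
    for pattern, subsector in SUBSECTOR_RULES_EXAMPLE.get(sector, []):
        if re.search(pattern, company_name, re.IGNORECASE):
            return subsector
    return default
```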
API: GET /api/tickers/<symbol>/peers/
Company logos are served through GET /api/logos/<symbol>.png. The resolution chain is designed so that missing logos are recoverable without code changes:
- Manual overrides (`backend/quotes/logo_overrides.py::LOGO_OVERRIDE_URLS`) — highest priority. Add `"<SYMBOL>": "https://..."` for any ticker whose auto-fetched logo is wrong or missing.
- `Ticker.logo` URL from the database — skipped entirely if the URL is a known provider placeholder (e.g. BRAPI's generic `BRAPI.svg`). Provider placeholders are also stripped at sync time in `brapi.sync_tickers`.
- BRAPI direct URL — `https://icons.brapi.dev/icons/<SYMBOL>.svg`.
- Generated fallback SVG — colored circle with the ticker's first letter. Never written to disk.
Real logos are cached to disk at LOGO_CACHE_DIR for 30 days. When all sources return placeholders or fail, the symbol is added to a 24-hour negative cache (in Redis) so subsequent requests don't re-hit the network.
Commands:
| Command | What it does |
|---|---|
| `./manage.py warm_logo_cache [--region ...]` | Pre-warm the disk cache for popular tickers. |
| `./manage.py audit_logos [--limit N] [--symbols ...]` | List tickers whose logo resolution ends in the generated fallback — use the output to populate `LOGO_OVERRIDE_URLS`. |
- Backend: Django 5 + Django REST Framework + PostgreSQL + Redis
- Frontend: React 19 + TypeScript + Next.js 15 + TanStack Query
- Styling: Tailwind CSS v4 (`@apply` only — no utility classes in JSX)
- Deploy: GitHub Actions CI/CD → DigitalOcean VPS
- Python 3.12+
- Node.js 20+
- A BRAPI API key
cd backend
python -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt
# Create .env from template (edit with your BRAPI key)
cp ../.env.example ../.env
# Run migrations and start server
python manage.py migrate
python manage.py refresh_ipca # fetch IPCA data
python manage.py refresh_tickers # fetch B3 ticker list
python manage.py runserver

cd frontend
npm install
npm run dev

The Next.js dev server proxies `/api` requests to Django on `localhost:8000`.
| Variable | Purpose |
|---|---|
| `DJANGO_SECRET_KEY` | Django secret key |
| `BRAPI_API_KEY` | BRAPI pro API key |
| `DATABASE_URL` | PostgreSQL connection string (production only) |
| `ALLOWED_HOSTS` | Comma-separated allowed hosts |
| `DEBUG` | `True` for development, `False` for production |
Pushes to main trigger a GitHub Actions workflow that SSHs to poe.ma, pulls the latest code, rebuilds Docker containers, runs migrations, and restarts services.
ssh root@poe.ma
cd /opt/sponda
git pull
docker compose build
docker compose run --rm web python manage.py migrate --noinput
docker compose up -d

The blog at blog.sponda.capital is a Hugo static site living in blog/ in this repo. It serves flat HTML from nginx — no runtime, no database, no JavaScript.
cd blog
hugo new content/posts/2026-04-15-my-post.md

Frontmatter supports tags, categories, and an explicit slug (recommended when the title has accents):
---
title: "Exemplo"
slug: "exemplo"
date: 2026-04-15
tags: ["petrobras", "dividendos"]
categories: ["análise"]
---
Markdown goes here. YouTube embeds use Hugo's built-in shortcode:
{{< youtube dQw4w9WgXcQ >}}

Commit and push to main; the deploy workflow builds the site on the server.
cd blog
hugo server
# open http://localhost:1313/

- `blog/content/posts/` · Markdown posts.
- `blog/layouts/` · custom DF-minimal HTML templates (no theme dependency).
- `blog/assets/css/main.css` · site CSS (fingerprinted and minified at build).
- `blog/static/` · favicon and fonts, copied verbatim to the output.
- `blog/hugo.toml` · site config.
Tags and categories auto-generate index pages at /tags/* and /categories/*. RSS feed is auto-generated at /index.xml.
Before blog.sponda.capital is reachable, the droplet needs:
- DNS: `A` record `blog.sponda.capital → 159.203.108.19`.
- `certbot --nginx -d blog.sponda.capital` (after DNS propagates).
- `ln -sf /etc/nginx/sites-available/blog.sponda.capital.conf /etc/nginx/sites-enabled/`.
- `nginx -t && systemctl reload nginx`.
Hugo itself is auto-installed by the deploy workflow if missing — no manual apt install needed.
Every subsequent git push to main rebuilds and publishes automatically.
Systemd timers run periodic jobs. Each timer is installed and enabled automatically on deploy. To inspect:
systemctl list-timers --all # all timers, next/last run
journalctl -u sponda-refresh.service # last-run logs for a unit

| Command | Timer | Purpose | Frequency |
|---|---|---|---|
| `refresh_ipca` + `refresh_tickers` | `sponda-refresh.timer` | Sync IPCA inflation index and B3 ticker list (~2,300 stocks) from BRAPI | Daily 06:00 UTC |
| `refresh_snapshot_prices` | `sponda-refresh-snapshots.timer` | Quote-only refresh: updates market cap + current price + recomputes PE10 / PFCF10 / PEG / P/FCF PEG against existing fundamentals. One API call per ticker. | Daily 07:00 UTC |
| `refresh_snapshot_fundamentals` | `sponda-refresh-fundamentals.timer` | Full refresh: resyncs quarterly earnings, cash flows, balance sheets, then recomputes the entire IndicatorSnapshot row. Four API calls per ticker. | Weekly Sun 06:00 UTC |
| `check_indicator_alerts` | `sponda-check-alerts.timer` | Evaluate user alerts against the latest snapshots; email on threshold crossings | Daily 07:30 UTC |
| `send_revisit_reminders` | `sponda-revisit-reminders.timer` | Email users whose scheduled company revisits are due or overdue | Daily 11:00 UTC |
The reminder service is Type=oneshot with Restart=on-failure (up to 3 retries 120s apart) so a transient SMTP error doesn't silently drop a day of notifications. The timer is Persistent=true, so a missed run (e.g. server reboot) catches up on next boot. Long-running services (sponda, sponda-frontend) use Restart=always.
Unified error, performance, and cron monitoring through Sentry (free tier) plus UptimeRobot for external health checks. Full plan and rollout status: docs/observability-plan.md.
How it works
- Django + Celery. `config.observability.init_sentry` runs from `settings/base.py`. It is a no-op when `SENTRY_DSN` is unset, so dev and tests stay quiet. `before_send` scrubs `Authorization`, `Cookie`, `Set-Cookie`, `DATABASE_URL`, and `SECRET_KEY` from events. Integrations: `DjangoIntegration`, `CeleryIntegration`, `LoggingIntegration` (INFO breadcrumbs, ERROR-level events).
- Systemd-timer commands. Subclass `config.monitored_command.MonitoredCommand` and implement `run()` instead of `handle()`. The base class captures any unhandled exception to Sentry and re-raises (so systemd still marks the unit as failed). Setting `sentry_monitor_slug` wraps execution in `sentry_sdk.crons.monitor`, so Sentry Crons alerts you when a timer misses or fails. All six timer-invoked commands (`refresh_ipca`, `refresh_tickers`, `refresh_snapshot_prices`, `refresh_snapshot_fundamentals`, `check_indicator_alerts`, `send_revisit_reminders`) use this base.
- Next.js. `@sentry/nextjs` is wired up via `sentry.client.config.ts`, `sentry.server.config.ts`, and `sentry.edge.config.ts`, all delegating to `src/lib/sentry.ts`. `withSentryConfig` in `next.config.ts` handles source-map upload at build time. Session Replay: 10% of sessions + 100% of error sessions (within free-tier quota).
- Request IDs. `config.middleware.request_id.RequestIDMiddleware` attaches a UUID to every request (or honors an inbound `X-Request-ID`, capped at 128 chars). The ID is echoed back in the `X-Request-ID` response header, tagged on the Sentry scope, and included in every JSON log line emitted during the request.
- Structured logging. `config.logging_formatter.JSONLogFormatter` emits one JSON object per log record (`timestamp`, `level`, `logger`, `message`, `request_id`, `exception`). Writes to stderr → captured by journald on production. No external log shipping yet; when we want it, point Promtail/Vector at the journal.
- External uptime. UptimeRobot (free) hits `https://sponda.capital/` and `https://sponda.capital/api/health/` every 5 minutes. Setup is manual, outside the repo.
Environment variables
| Name | Where | Purpose |
|---|---|---|
| `SENTRY_DSN` | backend `.env` | Django + Celery DSN. Unset → Sentry is inactive. |
| `SENTRY_ENVIRONMENT` | backend | `production` / `development`. Defaults to `development`. |
| `SENTRY_RELEASE` | backend | Git SHA for release-tagged events. Optional. |
| `SENTRY_TRACES_SAMPLE_RATE` | backend | Perf trace sampling. Defaults to `1.0`; lower when traffic grows. |
| `NEXT_PUBLIC_SENTRY_DSN` | frontend build | Browser DSN. Baked into the client bundle at build time. |
| `NEXT_PUBLIC_SENTRY_ENVIRONMENT` | frontend | Same semantics as backend, but client-side. |
| `NEXT_PUBLIC_SENTRY_RELEASE` | frontend | Client release tag. |
| `SENTRY_DSN_NEXTJS` | frontend runtime | DSN used by the Next.js Node + edge runtimes. Separate from Django's `SENTRY_DSN` so server-rendered and API-route errors reach the `javascript-nextjs` project. Falls back to `SENTRY_DSN` when unset. |
| `SENTRY_AUTH_TOKEN` | frontend build / CI | Source-map upload. Build succeeds without it; source maps just aren't uploaded. |
| `SENTRY_ORG`, `SENTRY_PROJECT` | frontend build | Target for source-map upload. |
Local testing
# Backend: tests run green with no DSN (init is a no-op).
cd backend && .venv/bin/pytest tests/test_observability.py tests/test_monitored_command.py tests/test_request_id_middleware.py tests/test_json_log_formatter.py
# Frontend: vitest covers the initSentry helper.
cd frontend && npx vitest run src/lib/sentry.test.ts
# End-to-end smoke (optional): export SENTRY_DSN=<dev-dsn> before running
# the dev server and trigger a 500 from any view to verify delivery.

The screener page at /[locale]/screener lets users filter the whole B3 universe by any of the indicators shown on a company's main page and sort the results. Backed by a dedicated IndicatorSnapshot table so filtering and sorting are one DB query instead of recomputing indicators for every ticker on every request.
All are numeric min / max bounds (either side optional):
- `pe10`, `pfcf10` · valuation multiples (10-year rolling)
- `peg`, `pfcf_peg` · growth-adjusted valuation
- `debt_to_equity`, `debt_ex_lease_to_equity`, `liabilities_to_equity`, `current_ratio` · leverage / liquidity
- `debt_to_avg_earnings`, `debt_to_avg_fcf` · debt vs. cash generation
- `market_cap` · absolute currency amount
- Snapshot table. `IndicatorSnapshot` (one row per ticker) stores the latest value of every screened indicator. The table is kept current by a three-layer refresh strategy designed to respect BRAPI Pro and FMP Starter monthly budgets:
  - Persist-on-view. Any time a user opens a company page, the `PE10View` endpoint writes the freshly computed indicators back into `IndicatorSnapshot` and updates `Ticker.market_cap` as a side-effect (wrapped in `try/except` so a write failure never breaks the page). This keeps actively viewed tickers perpetually fresh without any scheduled work.
  - Daily price refresh (`refresh_snapshot_prices`, 07:00 UTC). For every ticker with a market cap, fetches the current quote (one API call) and recomputes only the price-dependent indicators — PE10, PFCF10, PEG, P/FCF PEG — against existing DB fundamentals. Leverage and debt-coverage fields are left alone.
  - Weekly fundamentals refresh (`refresh_snapshot_fundamentals`, Sunday 06:00 UTC). Resyncs quarterly earnings / cash flows / balance sheets (three API calls per ticker) and then recomputes the full indicator set via `compute_company_indicators` — the same service the company page uses, so the screener and the company page can never disagree.
  - Bootstrap. `sync_market_caps` routes Brazilian tickers through BRAPI and US tickers through FMP to backfill `Ticker.market_cap` for rows that are missing it. Run once after adding new tickers; both refresh jobs skip tickers without a market cap.
- Query. `GET /api/screener/` takes `<field>_min` / `<field>_max` params, a `sort` (prefix `-` for descending; nulls always last), `limit` (max 500), and `offset`. Returns `{ count, results[] }`.
- Frontend. The `useScreener` hook (`frontend/src/hooks/useScreener.ts`) wraps the endpoint in React Query with `staleTime: 60s`. The page is `frontend/src/app/[locale]/screener/page.tsx` — sticky filter sidebar + results table with click-to-sort column headers and cursor-based "load more" pagination.
Example: GET /api/screener/?pe10_max=10&debt_to_equity_max=1&sort=-market_cap&limit=50 returns the 50 largest Brazilian companies with PE10 ≤ 10 and D/E ≤ 1.
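The `<field>_min` / `<field>_max` convention can be sketched as a translation into ORM-style lookups (illustrative, not the actual view code; `ALLOWED_FIELDS` here is a toy subset):

```python
# Sketch of translating screener query params into ORM-style lookups.
ALLOWED_FIELDS = {"pe10", "pfcf10", "peg", "debt_to_equity", "market_cap"}

def build_filters(params: dict[str, str]) -> dict[str, float]:
    filters: dict[str, float] = {}
    for key, raw in params.items():
        field, _, bound = key.rpartition("_")  # "pe10_max" -> ("pe10", "max")
        if field in ALLOWED_FIELDS and bound in {"min", "max"}:
            lookup = "gte" if bound == "min" else "lte"
            filters[f"{field}__{lookup}"] = float(raw)  # e.g. pe10__lte
    return filters
```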
Signed-in users can save thresholds on any screened indicator per ticker. When an indicator crosses a threshold, they get an email plus an on-screen entry at /[locale]/notificacoes.
- A small bell button sits next to each indicator label on the company page (`AlertButton` in `frontend/src/components/AlertButton.tsx`). Click it to pick a comparison (`≤` or `≥`) and a threshold value.
- Existing alerts for that (ticker, indicator) pair are listed inline so the popover is the single source of truth — no separate "manage alerts" page. Delete an alert with the `×` button.
- The `/notificacoes` page has a Triggered alerts section above the revisit reminders; each row links back to the company and can be dismissed (which deletes the alert).
IndicatorAlert (in backend/accounts/models.py) holds user, ticker, indicator, comparison (lte / gte), threshold (Decimal), active, and triggered_at. The unique constraint (user, ticker, indicator, comparison) means a user can set both a floor and a ceiling for the same indicator, but not two overlapping alerts. model.clean() validates the indicator against IndicatorAlert.ALLOWED_INDICATORS — the same 11 fields the screener supports.
check_indicator_alerts (daily 07:30 UTC via sponda-check-alerts.timer, right after the snapshot refresh):
- Batch-loads every active alert's latest snapshot in one query.
- For each alert, compares the indicator value to the threshold using the stored comparison operator (`None` values are skipped — no snapshot means no evaluation).
- On a false → true transition, sets `triggered_at = now()` and sends one email per alert. Re-triggers only happen after a true → false reset, so users aren't spammed on consecutive runs while the condition holds.
- Emails use Django's `send_mail` with a plain + HTML body (`_build_alert_email` in `backend/accounts/tasks.py`); the subject includes the ticker, indicator label, and threshold.
| Method | URL | Purpose |
|---|---|---|
| GET | `/api/auth/alerts/` | List current user's alerts. Optional `?ticker=PETR4` filter. |
| POST | `/api/auth/alerts/` | Create an alert: `{ ticker, indicator, comparison, threshold }`. 400 on duplicates. |
| PATCH | `/api/auth/alerts/<id>/` | Update `active`, `threshold`, or `comparison`. |
| DELETE | `/api/auth/alerts/<id>/` | Delete. Scoped to owner — other users get 404. |
Tickers are uppercased on write; thresholds are DecimalField(max_digits=20, decimal_places=6) so precision matches the snapshot fields. Auth is session-based with CSRF (frontend/src/utils/csrf.ts::csrfHeaders).
Free tier allows 3 lookups per day, tracked by session cookie. After the limit, users are prompted to create an account.
Signed-up users can favorite companies to pin them on the home page grid.
- Unverified users are capped at 20 favorites total, and the home page renders only the first 8.
- Verified users (those who confirmed their email) have no cap — they can add unlimited favorites and every favorite shows on the home page grid.
The backend cap lives in accounts.views.FavoriteListView (MAX_FAVORITES = 20). The home page render logic lives in getHomepageTickers in frontend/src/components/HomepageGrid.tsx.
Users whose email is not verified see a notice on the account page (/[locale]/account) with a "Resend verification email" button. The button calls POST /api/auth/resend-verification/ (in accounts.views.ResendVerificationView), which re-sends the branded verification link via _send_verification_email. The endpoint requires an authenticated session and returns 400 if the email is already verified. The UI lives in EmailVerificationSection inside frontend/src/app/[locale]/account/page.tsx.
Welcome and email-verification messages are rendered in the new user's preferred language. The User.language field (accounts.models.User, one of pt, en, es, zh, fr, de, it, default en) drives template selection.
At signup the frontend (AuthModal.tsx and [locale]/login/page.tsx) sends the current UI locale as language in the POST body. If the field is missing, SignupView._parse_accept_language picks the highest-q supported locale from the Accept-Language header, falling back to en. Any later verification resend (/api/auth/resend-verification/, change-email flow) reuses the value stored on the user.
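A sketch of the highest-q selection described above (not the actual `_parse_accept_language` implementation; the supported-locale set comes from the source):

```python
# Sketch of Accept-Language parsing: highest-q supported locale wins.
SUPPORTED = {"pt", "en", "es", "zh", "fr", "de", "it"}

def parse_accept_language(header: str, default: str = "en") -> str:
    candidates = []
    for part in header.split(","):
        piece = part.strip()
        if not piece:
            continue
        lang, _, q = piece.partition(";q=")
        lang = lang.split("-")[0].strip().lower()  # "pt-BR" -> "pt"
        try:
            weight = float(q) if q else 1.0  # missing q defaults to 1.0
        except ValueError:
            weight = 0.0
        if lang in SUPPORTED:
            candidates.append((weight, lang))
    return max(candidates)[1] if candidates else default
```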
Templates live under backend/accounts/templates/emails/:
- `welcome_base.html` / `verification_base.html` — shared HTML shell with `{% block %}` placeholders for every translatable string.
- `welcome_<lang>.html` / `verification_<lang>.html` — per-locale overrides (`extends` the base, fills blocks).
- `welcome_<lang>.txt` / `verification_<lang>.txt` — plain-text bodies per locale.
Subjects and localized share-link copy live in accounts/email_subjects.py. The sender (accounts.views._send_welcome_email / _send_verification_email) resolves the language via _resolve_language, renders the matching templates with render_to_string, and passes the localized subject.
To add a new locale: register it in SUPPORTED_LANGUAGES (accounts/models.py), add a row to both subject dicts and share_strings in email_subjects.py, and create the four template files (welcome_<lang>.html, welcome_<lang>.txt, verification_<lang>.html, verification_<lang>.txt).
