TL;DR

You do not need expensive data subscriptions to build profitable algorithmic trading strategies. This guide covers 25+ data sources available for free (or with generous free tiers) spanning market data, government filings, social sentiment, economic indicators, and alternative signals — with specific API endpoints, Python libraries, and rate limits for each.

Market Data

Market data is the foundation of any algorithmic trading system. Price history, volume, order books, and fundamentals are the raw inputs for technical and quantitative models. The four sources below cover equities, crypto, and forex without requiring a paid subscription.

1. Yahoo Finance

Yahoo Finance remains the single most useful free data source for retail algo traders. It provides historical OHLCV price data, real-time quotes (with a short delay), company fundamentals (P/E, EPS, revenue, market cap), analyst consensus ratings, dividend history, and options chains for US-listed equities.

How to access: Use the yfinance Python library. There is no official API, but the library wraps Yahoo's internal endpoints reliably.

import yfinance as yf

msft = yf.Ticker("MSFT")
hist = msft.history(period="1y")       # daily OHLCV
info = msft.info                        # fundamentals
options = msft.option_chain("2026-03-20")  # options chain

Update frequency: Price data is near-real-time (15-minute delay for free users). Fundamentals update daily.
Best use case: Technical indicator calculations (moving averages, RSI, MACD), fundamental screening, and backtesting on daily or weekly timeframes.
Limitations: No official API means Yahoo can change endpoints without notice. Rate limiting is undocumented but generally tolerates moderate usage. Not suitable for tick-level or sub-minute strategies.
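
A minimal sketch of the moving-average use case mentioned above, using only the standard library so it works with any price source (the helper names are illustrative, not part of yfinance):

```python
def sma(values, window):
    """Trailing simple moving average; None until the window fills."""
    out, running = [], 0.0
    for i, v in enumerate(values):
        running += v
        if i >= window:
            running -= values[i - window]
        out.append(running / window if i >= window - 1 else None)
    return out

def sma_signal(closes, fast=20, slow=50):
    """+1 if the fast SMA is above the slow SMA at the latest bar, else -1
    (0 while there is not yet enough history)."""
    f, s = sma(closes, fast)[-1], sma(closes, slow)[-1]
    if f is None or s is None:
        return 0
    return 1 if f > s else -1

# Live usage (network call via yfinance):
#   closes = yf.Ticker("MSFT").history(period="1y")["Close"].tolist()
#   print(sma_signal(closes))
```

The same pattern extends to RSI or MACD: compute the indicator from the close series, then reduce it to a directional signal.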

2. CoinGecko

CoinGecko is the most comprehensive free crypto data aggregator, covering over 10,000 cryptocurrencies with real-time prices, market capitalization, 24-hour trading volume, circulating supply, and historical data going back to each coin's inception.

How to access: Free REST API at api.coingecko.com/api/v3/. No authentication required for the public tier.

# Fetch Bitcoin price history (last 30 days)
GET https://api.coingecko.com/api/v3/coins/bitcoin/market_chart?vs_currency=usd&days=30

Update frequency: Prices update every 1–2 minutes. Historical data available at daily granularity.
Best use case: Crypto price feeds, market dominance ratios (BTC dominance as a risk-on/risk-off indicator), and volume anomaly detection across altcoins.
Limitations: Free tier is capped at 30 API calls per minute. No websocket streaming — you must poll. Delayed compared to direct exchange feeds.
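
Polling the market_chart endpoint from Python needs nothing beyond the standard library. A minimal sketch (the response carries 'prices' and 'total_volumes' as [timestamp_ms, value] pairs per CoinGecko's documented format; the `total_return` helper is illustrative):

```python
import json
from urllib.request import Request, urlopen

def fetch_market_chart(coin="bitcoin", days=30):
    """Fetch price/volume history from the endpoint shown above."""
    url = (f"https://api.coingecko.com/api/v3/coins/{coin}"
           f"/market_chart?vs_currency=usd&days={days}")
    with urlopen(Request(url, headers={"User-Agent": "research-bot"})) as resp:
        return json.load(resp)

def total_return(prices):
    """Fractional return from the first to the last [timestamp_ms, price] pair."""
    return prices[-1][1] / prices[0][1] - 1
```

Stay under the 30 calls/minute cap by polling each coin no more than once a minute and caching the result.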

3. Binance Public API

Binance's public API provides real-time order book depth, recent trades, candlestick (kline) data, and 24-hour ticker statistics for every trading pair on the exchange. No authentication is required for public market data endpoints.

How to access: REST API at api.binance.com. Websocket streams available at wss://stream.binance.com:9443.

# Fetch BTC/USDT 1-hour candles
GET https://api.binance.com/api/v3/klines?symbol=BTCUSDT&interval=1h&limit=100

# Stream real-time trades via websocket
wss://stream.binance.com:9443/ws/btcusdt@trade

Update frequency: Real-time via websocket. REST endpoints update continuously.
Best use case: Crypto order book analysis, real-time trade flow, and high-frequency crypto strategies.
Limitations: Public endpoints are rate-limited to 1,200 requests per minute (weight-based). Only covers assets listed on Binance. No equities or forex.
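
The kline endpoint returns rows of [open_time, open, high, low, close, volume, ...] with prices encoded as strings, so a small parsing step is needed before any math. A sketch (helper names are illustrative):

```python
import json
from urllib.request import urlopen

def fetch_klines(symbol="BTCUSDT", interval="1h", limit=100):
    """Fetch candlesticks from the REST endpoint shown above."""
    url = (f"https://api.binance.com/api/v3/klines"
           f"?symbol={symbol}&interval={interval}&limit={limit}")
    with urlopen(url) as resp:
        return json.load(resp)

def parse_closes(klines):
    """Close prices (index 4 in each kline row), converted to floats."""
    return [float(row[4]) for row in klines]
```

For real-time strategies, prefer the websocket stream above over REST polling; it avoids burning request weight entirely.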

4. Alpha Vantage Free Tier

Alpha Vantage provides stock and forex price data, company fundamentals (income statements, balance sheets, cash flow), and over 50 technical indicator calculations via API. The free tier is limited but functional for daily strategies.

How to access: Register for a free API key at alphavantage.co. REST API with JSON responses.

# Fetch daily adjusted prices for AAPL
GET https://www.alphavantage.co/query?function=TIME_SERIES_DAILY_ADJUSTED&symbol=AAPL&apikey=YOUR_KEY

Update frequency: Daily for fundamentals. Intraday data available at 1, 5, 15, 30, and 60-minute intervals.
Best use case: Forex data, fundamental analysis, and as a fallback when Yahoo Finance is unavailable.
Limitations: Free tier is capped at 25 API calls per day (not per minute — per day). This is the most restrictive free tier on this list. You will need to batch and cache aggressively.
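
With only 25 calls a day, a simple on-disk cache is the difference between a usable source and an exhausted quota by 9:35 AM. A sketch of the caching discipline (the helper and file name are illustrative):

```python
import json
import os
import time

def cached_get(fetch, cache_path, max_age_s=86400):
    """Call fetch() at most once per max_age_s, persisting the JSON result
    to cache_path in between -- essential with a 25-call/day quota."""
    if os.path.exists(cache_path):
        age = time.time() - os.path.getmtime(cache_path)
        if age < max_age_s:
            with open(cache_path) as f:
                return json.load(f)
    data = fetch()
    with open(cache_path, "w") as f:
        json.dump(data, f)
    return data

# Live usage (requires the requests package and an API key):
#   data = cached_get(
#       lambda: requests.get("https://www.alphavantage.co/query",
#                            params={"function": "TIME_SERIES_DAILY_ADJUSTED",
#                                    "symbol": "AAPL", "apikey": KEY}).json(),
#       "aapl_daily.json")
```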

Government and Regulatory

Government data sources are unique in algorithmic trading because they are authoritative, structured, and often contain signals that commercial data providers charge thousands of dollars to repackage. Every source in this section is genuinely free, funded by taxpayers, and updated on a predictable schedule.

5. SEC EDGAR

The Securities and Exchange Commission's EDGAR database contains every public filing made by US-listed companies: 10-K annual reports, 10-Q quarterly reports, 8-K material event disclosures, Form 4 insider transactions, and 13-F institutional holdings. Insider buying clusters are one of the most historically reliable bullish indicators in equity markets.

How to access: Full-text search API at efts.sec.gov/LATEST/search-index. EDGAR filing feeds at sec.gov/cgi-bin/browse-edgar. Rate limit: 10 requests per second (with a required User-Agent header identifying your application).

# Search for recent 8-K filings mentioning "acquisition"
GET https://efts.sec.gov/LATEST/search-index?q=%22acquisition%22&dateRange=custom&startdt=2026-02-01&enddt=2026-02-27&forms=8-K

Update frequency: Filings appear within minutes of submission. Form 4 insider transactions must be filed within 2 business days of the trade.
Best use case: Insider transaction tracking, earnings surprise detection from 8-K filings, and institutional holdings analysis from 13-F filings (filed quarterly).
Limitations: Raw filings are in SGML/HTML/XBRL formats and require parsing. The data is unstructured compared to commercial providers. High-frequency Form 4 monitoring requires careful rate-limit management.
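
The rate-limit management mentioned above can be handled with a tiny spacing helper; the User-Agent string below is a placeholder you must replace with your own contact details, as the SEC requires:

```python
import time

class RateLimiter:
    """Space requests at least 1/per_second apart (SEC allows 10 req/s)."""
    def __init__(self, per_second=10.0):
        self.min_interval = 1.0 / per_second
        self.last = 0.0

    def wait(self):
        sleep_for = self.min_interval - (time.monotonic() - self.last)
        if sleep_for > 0:
            time.sleep(sleep_for)
        self.last = time.monotonic()

# SEC rejects requests without an identifying User-Agent (example value):
HEADERS = {"User-Agent": "MyResearchBot contact@example.com"}

# Live usage (requires the requests package):
#   limiter = RateLimiter(10)
#   limiter.wait()
#   requests.get("https://efts.sec.gov/LATEST/search-index"
#                "?q=%22acquisition%22&forms=8-K", headers=HEADERS)
```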

6. FRED (Federal Reserve Economic Data)

Maintained by the Federal Reserve Bank of St. Louis, FRED is the most comprehensive public database of US economic data — over 800,000 time series covering interest rates, inflation (CPI, PCE), employment, GDP, money supply, credit spreads, housing, and international economics.

How to access: Register for a free API key at fred.stlouisfed.org. Python library: fredapi.

from fredapi import Fred
fred = Fred(api_key='YOUR_KEY')

# Federal funds rate
ffr = fred.get_series('FEDFUNDS')

# 10-Year Treasury yield
t10y = fred.get_series('DGS10')

Update frequency: Varies by series. GDP is quarterly. CPI and employment are monthly. Interest rates are daily.
Best use case: Macroeconomic regime detection (growth vs. recession, rising rates vs. falling rates), yield curve analysis, and inflation-adjusted return calculations.
Limitations: Most series have publication lag (e.g., GDP is released ~30 days after the quarter ends). The API key registration is manual but takes under two minutes.
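
The yield-curve use case above reduces to a subtraction once the series are fetched. A sketch (helper names are illustrative; DGS10 and DGS2 are real FRED series IDs):

```python
# Live usage (requires fredapi and an API key):
#   from fredapi import Fred
#   fred = Fred(api_key='YOUR_KEY')
#   t10 = fred.get_series('DGS10')
#   t2 = fred.get_series('DGS2')

def curve_spread(long_yields, short_yields):
    """Pointwise long-minus-short spread, e.g. 10-year minus 2-year."""
    return [long_ - short for long_, short in zip(long_yields, short_yields)]

def is_inverted(spread):
    """A negative latest spread marks an inverted curve."""
    return spread[-1] < 0
```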

7. Congressional Trading (Capitol Trades)

Under the STOCK Act of 2012, members of the US Congress must publicly disclose their securities transactions. Academic research has documented that congressional portfolios have historically outperformed the S&P 500, making these disclosures a legitimate alternative data source.

How to access: Disclosures are published at efdsearch.senate.gov (Senate) and disclosures-clerk.house.gov (House). Third-party aggregators like Capitol Trades and Quiver Quantitative provide structured feeds.
Update frequency: Members have up to 45 days to report. Disclosures are published in batches, typically weekly.
Best use case: Identifying stocks with unusual buying activity by committee members who may have informational advantages related to upcoming legislation or regulatory action.
Limitations: The 45-day reporting delay means these signals are lagging indicators, not real-time. Dollar amounts are reported in ranges (e.g., $1,001–$15,000), not exact figures.

8. Treasury Direct

The US Treasury Department publishes daily yield curve rates, auction results, and outstanding debt data. The yield curve — the spread between short-term and long-term interest rates — is one of the most watched macroeconomic indicators in finance. An inverted yield curve has preceded every US recession since 1970.

How to access: Daily yield curve data at home.treasury.gov/resource-center/data-chart-center/interest-rates. Machine-readable XML/CSV downloads are available.
Update frequency: Daily, after market close.
Best use case: Yield curve shape analysis (normal, flat, inverted) for macro regime detection and bond market signals.
Limitations: No real-time intraday yield data. Auction results are published after the auction closes, not before.

9. Bureau of Labor Statistics (BLS)

The BLS publishes the most market-moving economic indicators in the US: Consumer Price Index (CPI), unemployment rate, Non-Farm Payrolls (NFP), Producer Price Index (PPI), and Average Hourly Earnings. NFP and CPI releases routinely move the S&P 500 by 1–2% in minutes.

How to access: Public API at api.bls.gov/publicAPI/v2/timeseries/data/. Registration gives you 500 queries per day (v2). Unregistered access allows 25 queries per day (v1).

# Fetch CPI-U (All Items) for 2025-2026
POST https://api.bls.gov/publicAPI/v2/timeseries/data/
{"seriesid": ["CUUR0000SA0"], "startyear": "2025", "endyear": "2026"}

Update frequency: Monthly. Release dates are published a year in advance on the BLS release calendar.
Best use case: Pre-positioning around major economic releases. Comparing actual vs. consensus estimates to predict post-release price moves.
Limitations: Data is backward-looking and subject to revision. The initial release is a preliminary estimate; revised figures come 1–2 months later.
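
The POST request above translates directly to the standard library; the parsing helper assumes the documented BLS response layout (Results.series[].data[], newest observation first):

```python
import json
from urllib.request import Request, urlopen

def fetch_bls(series_ids, start_year, end_year, api_key=None):
    """POST a multi-series request to the BLS v2 API (key optional on v1)."""
    payload = {"seriesid": series_ids, "startyear": start_year, "endyear": end_year}
    if api_key:
        payload["registrationkey"] = api_key
    req = Request("https://api.bls.gov/publicAPI/v2/timeseries/data/",
                  data=json.dumps(payload).encode(),
                  headers={"Content-Type": "application/json"})
    with urlopen(req) as resp:
        return json.load(resp)

def latest_point(response):
    """(year, period, value) of the newest observation in the first series."""
    obs = response["Results"]["series"][0]["data"][0]
    return obs["year"], obs["period"], float(obs["value"])
```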

News and RSS

News headlines move markets faster than any other data type. A single earnings surprise, FDA decision, or geopolitical event can shift a stock 5–10% in seconds. RSS feeds are the simplest way to ingest news programmatically without paying for a terminal subscription.

10. Reuters RSS

Breaking financial news from one of the world's two major wire services. Reuters covers equities, currencies, commodities, and central bank policy with a focus on factual reporting.

How to access: RSS feeds at reuters.com/arc/outboundfeeds/. Parse with any RSS library (feedparser in Python).
Update frequency: Continuous. New stories appear within minutes of events.
Best use case: Real-time headline sentiment analysis. Cross-reference with price data to identify news-driven moves.
Limitations: RSS feeds provide headlines and summaries only, not full articles. Some content may be behind authentication.
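
A minimal headline-screening sketch. The keyword set is illustrative, not exhaustive, and the exact feed path under reuters.com/arc/outboundfeeds/ varies by section:

```python
# Live usage (requires the feedparser package):
#   import feedparser
#   feed = feedparser.parse("https://www.reuters.com/arc/outboundfeeds/")  # plus feed path
#   headlines = [entry.title for entry in feed.entries]

MARKET_KEYWORDS = {"fed", "rates", "earnings", "inflation", "acquisition", "ipo"}

def market_relevant(title, keywords=MARKET_KEYWORDS):
    """Cheap keyword screen that keeps only plausibly market-moving headlines."""
    return bool(set(title.lower().split()) & keywords)
```

A screen like this is a pre-filter, not a sentiment model; it just keeps your NLP pipeline from processing every headline on the wire.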

11. CNBC RSS

Market-focused headlines covering earnings reports, analyst upgrades and downgrades, sector rotation, and Federal Reserve commentary. CNBC's editorial focus on US equities makes it particularly useful for stock-specific signals.

How to access: RSS feeds at cnbc.com/id/100003114/device/rss/rss.html (top news) and similar endpoints for specific sections.
Update frequency: Continuous during market hours. Reduced volume after hours.
Best use case: Named entity extraction — identifying companies, executives, and products mentioned in headlines and mapping them to tradable symbols.
Limitations: CNBC has an editorial bias toward generating engagement, which can amplify noise. Weight headlines by source credibility in your pipeline.

12. AP News RSS

Geopolitical events, regulatory actions, natural disasters, and macro-level developments. AP's reporting is factual and low-noise compared to editorial outlets, making it a reliable input for broad market sentiment models.

How to access: RSS feeds at apnews.com (check specific section feeds for business and economy).
Update frequency: Continuous.
Best use case: Detecting macro-level events (sanctions, trade policy changes, government shutdowns) that affect broad market direction rather than individual stocks.
Limitations: AP covers all news, not just financial. You will need keyword filtering to extract market-relevant stories.

13. Finviz

Finviz is a stock screener and news aggregator that consolidates headlines from dozens of financial news sources into a single per-ticker news feed. It also provides a visual heat map of market sectors and prebuilt screeners for technical and fundamental criteria.

How to access: The website at finviz.com can be scraped for news feeds and screener results. There is no official free API. Python library: finvizfinance.

from finvizfinance.quote import finvizfinance
stock = finvizfinance('AAPL')
news = stock.ticker_news()  # recent news headlines

Update frequency: News aggregation is near-real-time. Screener data updates after each trading session.
Best use case: Rapid news aggregation by ticker. Useful for checking what headlines are driving a specific stock's movement.
Limitations: Finviz's terms of service restrict automated scraping. Use sparingly and respect their infrastructure. The free version shows delayed quotes.

Social Sentiment

Retail trader sentiment often precedes or amplifies price moves. Social platforms capture this sentiment in real time, and natural language processing can extract actionable signals from the noise.

14. StockTwits

The largest social network dedicated to stock market discussion. Users tag posts with specific ticker symbols and self-label their sentiment as bullish or bearish. This structured data makes StockTwits uniquely useful for quantitative sentiment analysis.

How to access: Free API at api.stocktwits.com/api/2/streams/symbol/{SYMBOL}.json. No authentication required for basic endpoints.

# Fetch recent messages for TSLA
GET https://api.stocktwits.com/api/2/streams/symbol/TSLA.json

Update frequency: Real-time. New messages appear within seconds of posting.
Best use case: Rolling sentiment ratio (bullish vs. bearish posts) and message volume spikes. A sudden volume surge on a low-activity ticker often precedes a price move.
Limitations: Sentiment labels are self-reported and can be gamed. Message quality varies widely. Best used as one signal among many, not in isolation.
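
A sketch of the rolling sentiment ratio described above. It assumes the StockTwits message schema (entities.sentiment.basic is "Bullish", "Bearish", or absent) and returns a neutral 0.5 when no messages are labeled:

```python
def sentiment_ratio(messages):
    """Bullish share of sentiment-labeled messages from a symbol stream."""
    bulls = bears = 0
    for message in messages:
        sentiment = (message.get("entities") or {}).get("sentiment") or {}
        label = sentiment.get("basic")
        if label == "Bullish":
            bulls += 1
        elif label == "Bearish":
            bears += 1
    total = bulls + bears
    return bulls / total if total else 0.5
```

Pair the ratio with message volume: a 0.9 ratio on 5 messages means little; on 500 it may mean something.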

15. Reddit (Free Tier)

Reddit hosts the largest communities of retail traders: r/wallstreetbets (15M+ members), r/stocks, r/investing, and r/options. Post frequency, upvote velocity, comment volume, and ticker mention counts all provide measurable signals.

How to access: Reddit API with OAuth credentials. Register an application at reddit.com/prefs/apps. Python library: praw.

import praw
reddit = praw.Reddit(client_id='...', client_secret='...', user_agent='...')
for submission in reddit.subreddit('wallstreetbets').hot(limit=25):
    print(submission.title, submission.score)

Update frequency: Real-time. API allows 100 requests per minute on the free tier.
Best use case: Detecting momentum events when a ticker mention count spikes from its baseline (e.g., 10 mentions/day to 500).
Limitations: Requires OAuth setup. Rate limits can be restrictive for monitoring multiple subreddits. Post-2023 API changes reduced free-tier access. Sentiment analysis requires NLP — unlike StockTwits, posts are not pre-labeled.
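
The mention-counting idea above can be sketched with a cashtag regex over post titles. Both the regex and the noise-word list are illustrative starting points; real pipelines validate symbols against an exchange listing:

```python
import re
from collections import Counter

TICKER_RE = re.compile(r"\$([A-Z]{1,5})\b|\b([A-Z]{2,5})\b")
NOISE = {"CEO", "FDA", "USA", "IMO", "DD", "YOLO", "ATH", "EPS"}  # extend as needed

def count_tickers(titles):
    """Count $CASHTAG and bare upper-case mentions across post titles,
    dropping common all-caps noise words."""
    counts = Counter()
    for title in titles:
        for cashtag, bare in TICKER_RE.findall(title):
            symbol = cashtag or bare
            if symbol and symbol not in NOISE:
                counts[symbol] += 1
    return counts
```

Feed it the titles from the praw loop above, snapshot the counts daily, and alert on large deviations from each ticker's baseline.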

16. Hacker News API

Hacker News (news.ycombinator.com) is the dominant forum for the technology industry. Front-page posts about companies, products, and IPOs often coincide with increased trading activity in related stocks. The API is fully open with no authentication required.

How to access: Firebase API at hacker-news.firebaseio.com/v0/. Documentation at github.com/HackerNews/API.

# Fetch top stories
GET https://hacker-news.firebaseio.com/v0/topstories.json

Update frequency: Real-time. Story and comment IDs are available immediately.
Best use case: Tech sector sentiment. IPO buzz detection. Early signal on product launches, outages, or security incidents that affect tech stock prices.
Limitations: Audience is heavily tech-focused. Not useful for non-tech equities, forex, or commodities. Requires keyword matching to map discussions to tradable tickers.
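
The keyword matching mentioned above can be sketched as a name-to-ticker lookup over story titles; the `COMPANY_TICKERS` map is illustrative and would need to be maintained for real use:

```python
import json
from urllib.request import urlopen

HN = "https://hacker-news.firebaseio.com/v0"

def fetch_top_ids(limit=30):
    """IDs of the current top stories."""
    with urlopen(f"{HN}/topstories.json") as resp:
        return json.load(resp)[:limit]

def fetch_item(item_id):
    """Full item record (title, score, url, ...) for one story or comment."""
    with urlopen(f"{HN}/item/{item_id}.json") as resp:
        return json.load(resp)

COMPANY_TICKERS = {"microsoft": "MSFT", "nvidia": "NVDA", "apple": "AAPL"}

def match_tickers(title, mapping=COMPANY_TICKERS):
    """Tickers whose company name appears in a story title."""
    lower = title.lower()
    return [ticker for name, ticker in mapping.items() if name in lower]
```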

Economic Calendars

Knowing when high-impact economic data will be released is as important as the data itself. Economic calendars let you reduce position sizes ahead of volatile events and avoid opening new trades when a market-moving number is minutes away.

17. Investing.com Economic Calendar

A comprehensive calendar of global economic events with impact ratings (low, medium, high), previous values, consensus forecasts, and actual results. Covers FOMC decisions, NFP, CPI, PMI, retail sales, and central bank meetings worldwide.

How to access: The calendar is available at investing.com/economic-calendar/. There is no official free API; community libraries such as investpy have provided programmatic access in the past but break frequently as Investing.com tightens its bot protections, so treat any scraper as fragile and cache aggressively.
Update frequency: Events are scheduled months in advance. Actual vs. consensus results are published within seconds of the release.
Best use case: Event-driven risk management. Reduce or close positions ahead of high-impact releases. Identify trading windows with lower event risk.
Limitations: No official API. Scraping may violate terms of service. Use cached data where possible.

18. ForexFactory Calendar

A forex-focused economic calendar used widely by currency traders. ForexFactory excels at filtering events by currency pair and impact level, and its community provides context and historical analysis for each event.

How to access: Calendar at forexfactory.com/calendar. Data can be parsed from the HTML or accessed via community-built scrapers.
Update frequency: Updated in real time as events are released.
Best use case: Forex strategies that need to avoid trading around central bank decisions, employment data, or GDP releases for specific currencies.
Limitations: No official API. The website is not designed for programmatic access. Consider caching the weekly calendar on Monday and updating intraday only for actual results.

Alternative Data

Alternative data captures signals that traditional market data and news cannot. Search trends, developer activity, page view spikes, and prediction market odds provide an informational edge that is still underexploited by retail traders.

19. Wikipedia Pageviews API

Academic research has demonstrated that spikes in Wikipedia page views for publicly traded companies correlate with increased trading volume and can predict short-term price volatility. When a company's page jumps from 500 views/day to 50,000, something is happening.

How to access: Wikimedia REST API at wikimedia.org/api/rest_v1/metrics/pageviews/. Fully open, no authentication required.

# Daily page views for "Tesla,_Inc." article
GET https://wikimedia.org/api/rest_v1/metrics/pageviews/per-article/en.wikipedia/all-access/user/Tesla,_Inc./daily/20260101/20260227

Update frequency: Daily, with a 24-hour lag.
Best use case: Attention-based signals. Detect when public interest in a company spikes before it shows up in traditional news sentiment.
Limitations: 24-hour lag makes this unsuitable for intraday strategies. Page view data can be noisy around holidays and Wikipedia editing events.
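
The spike detection described above is a z-score against a trailing baseline. A sketch (window and threshold are illustrative defaults):

```python
from statistics import mean, stdev

def view_spike(daily_views, window=14, threshold=3.0):
    """True when the newest daily count sits `threshold` standard deviations
    above the trailing `window`-day baseline."""
    if len(daily_views) < window + 1:
        return False
    baseline, latest = daily_views[-window - 1:-1], daily_views[-1]
    mu, sigma = mean(baseline), stdev(baseline)
    return sigma > 0 and (latest - mu) / sigma > threshold

# Live usage: feed this the view counts from the pageviews endpoint above,
# e.g. [item["views"] for item in response["items"]].
```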

20. GitHub API

For publicly traded technology companies that maintain open-source projects, GitHub activity metrics — commit frequency, contributor count, issue velocity, and release cadence — serve as a proxy for development momentum. Declining activity in a core product's repository can be an early warning signal.

How to access: REST API at api.github.com. Unauthenticated: 60 requests/hour. Authenticated (free token): 5,000 requests/hour.

# Fetch recent commits for a repository
GET https://api.github.com/repos/microsoft/vscode/commits?per_page=30

Update frequency: Real-time. Commit and release data is available immediately.
Best use case: Tracking development activity for companies like Microsoft (VS Code, TypeScript), Google (TensorFlow, Kubernetes), and Meta (React, PyTorch). Correlate release cadence with earnings.
Limitations: Only applicable to companies with significant open-source presence. Many publicly traded companies have minimal or no public GitHub activity. Does not capture proprietary development.
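
A sketch of pulling commit activity with an optional token; the date extraction assumes the standard GitHub commits payload (commit.author.date):

```python
import json
from urllib.request import Request, urlopen

def fetch_commits(repo, token=None, per_page=30):
    """List recent commits; a token raises the quota from 60 to 5,000 req/hour."""
    headers = {"Accept": "application/vnd.github+json"}
    if token:
        headers["Authorization"] = f"Bearer {token}"
    req = Request(f"https://api.github.com/repos/{repo}/commits?per_page={per_page}",
                  headers=headers)
    with urlopen(req) as resp:
        return json.load(resp)

def commit_days(commits):
    """ISO dates (YYYY-MM-DD) pulled from each commit's author timestamp."""
    return [c["commit"]["author"]["date"][:10] for c in commits]
```

Aggregate `commit_days` into commits-per-week and watch for sustained declines rather than single quiet weeks.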

21. Google Trends (via pytrends)

Google Trends measures search interest for any term on a normalized 0–100 scale. A sudden increase in search volume for a ticker symbol or company name frequently precedes a large price move — earnings surprises, FDA approvals, acquisition rumors, and scandals all generate search spikes before the impact is fully priced in.

How to access: No official API, but the pytrends Python library provides reliable programmatic access.

from pytrends.request import TrendReq
pytrends = TrendReq()
pytrends.build_payload(['NVDA', 'AMD'], timeframe='now 7-d')
df = pytrends.interest_over_time()

Update frequency: Hourly for "now 7-d" timeframe. Daily for longer periods.
Best use case: Relative attention comparison between competing companies (e.g., NVDA vs. AMD). Detecting breakout interest before it hits mainstream news.
Limitations: Data is relative, not absolute — a score of 100 means peak interest for that specific query, not a fixed number of searches. Google rate-limits aggressive usage and may return 429 errors.

22. Polymarket API

Polymarket is a prediction market where users trade binary contracts on real-world events. The platform covers Fed rate decisions, election outcomes, regulatory actions, and macroeconomic scenarios. Prediction markets have historically outperformed polls, surveys, and expert forecasts for event probability estimation.

How to access: Public API at gamma-api.polymarket.com. No authentication required for market data.

# Fetch active markets
GET https://gamma-api.polymarket.com/markets?active=true&limit=20

Update frequency: Real-time. Odds update continuously as users trade.
Best use case: Forward-looking macro indicators. If Polymarket shows a 90% probability of a rate cut at the next FOMC meeting, your strategy can position accordingly before the announcement.
Limitations: Market liquidity varies. Low-volume markets may have wide spreads and unreliable odds. Limited history for backtesting.
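
One standard way to act on prediction-market odds is Kelly sizing against your own probability estimate. A sketch that assumes a fill at the quoted price and ignores fees and slippage:

```python
def kelly_fraction(model_prob, price):
    """Kelly-optimal bankroll fraction for a binary contract bought at `price`
    (between 0 and 1) that pays $1 if the event occurs. A negative result
    means no bet (or consider the opposite side)."""
    if not 0 < price < 1:
        raise ValueError("price must be strictly between 0 and 1")
    b = (1 - price) / price              # net odds per dollar staked
    return (model_prob * b - (1 - model_prob)) / b
```

If the market prices an event at $0.50 and your model says 75%, full Kelly stakes half the bankroll; most practitioners bet a fraction of Kelly to reduce variance.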

23. DeFi Llama

DeFi Llama tracks Total Value Locked (TVL) across decentralized finance protocols on all major blockchains. TVL is the DeFi equivalent of assets under management — a rising TVL during falling prices suggests accumulation, while falling TVL during rising prices signals distribution.

How to access: Open API at api.llama.fi. No authentication required. Full documentation at defillama.com/docs/api.

# Fetch TVL for all protocols
GET https://api.llama.fi/protocols

# Fetch historical TVL for a specific chain
GET https://api.llama.fi/v2/historicalChainTvl/Ethereum

Update frequency: Every 30 minutes.
Best use case: Crypto macro analysis. Capital flows between chains and protocols indicate where smart money is moving in the DeFi ecosystem.
Limitations: TVL can be inflated by recursive lending and token price changes. Not all protocols report accurately. Use as a directional signal, not a precise measure.
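
The accumulation/distribution heuristic above can be encoded directly; the response shape in the docstring reflects DeFi Llama's documented historicalChainTvl format, and the signal function is an illustrative simplification:

```python
import json
from urllib.request import urlopen

def fetch_chain_tvl(chain="Ethereum"):
    """Historical TVL for one chain, as [{'date': unix_ts, 'tvl': usd}, ...]."""
    with urlopen(f"https://api.llama.fi/v2/historicalChainTvl/{chain}") as resp:
        return json.load(resp)

def flow_signal(tvl_change, price_change):
    """Rising TVL into falling prices reads as accumulation; falling TVL
    against rising prices reads as distribution."""
    if tvl_change > 0 and price_change < 0:
        return "accumulation"
    if tvl_change < 0 and price_change > 0:
        return "distribution"
    return "neutral"
```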

24. Unusual Whales (Free Tier)

Unusual Whales tracks options flow, dark pool activity, and institutional positioning across the US options market. Large options trades — especially sweeps in short-dated, out-of-the-money contracts — often signal informed positioning ahead of catalysts.

How to access: Free tier at unusualwhales.com provides basic flow alerts and a daily summary. API access requires a paid plan, but the free tier dashboard is useful for manual monitoring and building intuition.
Update frequency: Real-time during market hours.
Best use case: Detecting unusual options volume that may indicate informed institutional activity. Put/call ratio analysis for sentiment.
Limitations: The free tier is limited in scope and history. Full programmatic access requires a subscription. Options flow signals have a high false-positive rate when used in isolation.

25. Quandl / Nasdaq Data Link (Free Datasets)

Now rebranded as Nasdaq Data Link, Quandl hosts hundreds of free curated financial datasets including commodity futures, economic indicators, and alternative data tables. Useful free datasets include the WIKI prices table (historical US equity prices; the table stopped updating in 2018 but remains useful for long-horizon backtests) and the FRED mirror.

How to access: Register for a free API key at data.nasdaq.com. Python library: nasdaqdatalink (formerly quandl).

import nasdaqdatalink
nasdaqdatalink.ApiConfig.api_key = 'YOUR_KEY'
data = nasdaqdatalink.get('FRED/GDP')  # US GDP via FRED mirror

Update frequency: Varies by dataset. Most free datasets update daily or on the underlying source's release schedule.
Best use case: Clean, structured access to datasets that would otherwise require scraping or manual downloads. Useful for backtesting with consistent historical data.
Limitations: Many of Quandl's most valuable datasets (including some that were previously free) have moved behind paywalls under Nasdaq ownership. Check dataset availability before building dependencies on specific tables.

How to Combine Multiple Sources

No single data source is sufficient for a robust trading strategy. The most resilient algorithmic systems combine signals across categories — market data for timing, fundamentals for direction, sentiment for momentum confirmation, and alternative data for edge — and weigh them according to their historical predictive accuracy for each asset class.

The key principle is signal aggregation with dynamic weighting. A naive approach treats every signal equally: if StockTwits is bullish, FRED shows a strong economy, and insider buying is elevated, all three count the same. A better approach measures each source's predictive power over a rolling window and adjusts its contribution to the final signal accordingly. Sources that have been accurate recently get more weight. Sources that have been noisy get less.
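
The dynamic-weighting idea can be sketched in a few lines. This is a minimal illustration of the principle, not any particular production system: each source emits a directional signal, and its weight tracks its rolling hit rate against realized returns:

```python
from collections import deque

class WeightedEnsemble:
    """Combine directional signals (+1 long / -1 short) from several sources,
    weighting each by its rolling hit rate against realized returns."""
    def __init__(self, sources, window=50):
        self.hits = {s: deque(maxlen=window) for s in sources}

    def record(self, signals, realized_return):
        """After each period, score whether each source called the direction."""
        for source, signal in signals.items():
            self.hits[source].append(1.0 if signal * realized_return > 0 else 0.0)

    def combine(self, signals):
        """Weighted-average signal in [-1, 1]; unscored sources get 0.5."""
        weights = {s: (sum(h) / len(h) if h else 0.5) for s, h in self.hits.items()}
        total = sum(weights[s] for s in signals) or 1.0
        return sum(weights[s] * signals[s] for s in signals) / total
```

A source that has been consistently wrong decays toward zero weight and stops influencing the combined signal, which is exactly the down-weighting behavior described above.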

This is the approach that slmaj takes. Its ML ensemble ingests data from 25+ sources and dynamically weighs each one based on recent performance. The ensemble does not require you to manually tune weights or decide which sources matter — the model handles that automatically. If a source degrades or becomes unreliable, it is down-weighted in real time. For more detail on how this works, see the How It Works page and the full data sources reference.

If you are building your own system, start with a simple equal-weight voting scheme across 3–5 sources, then gradually introduce performance-based weighting as you collect enough data to evaluate each source's contribution.

Getting Started

You do not need all 25 sources to begin. Start with a focused stack and expand as you learn what works for your strategy and asset class.

For equities: Begin with Yahoo Finance (price data and fundamentals), one RSS feed (Reuters or CNBC), and StockTwits (sentiment). These three sources are free, require no API keys, and cover the core signal categories: price, news, and sentiment.

For crypto: Start with CoinGecko (price data), Binance public API (order book), and DeFi Llama (TVL flows). All three are free and cover the most important crypto-specific signals.

For macro/forex: Begin with FRED (economic indicators), the BLS API (employment and inflation), and ForexFactory (economic calendar). These sources provide the fundamental data that drives currency and interest rate markets.

Once you are comfortable with your initial stack, add sources from the Government and Alternative Data categories. SEC EDGAR insider transactions and Google Trends are high-value additions that require minimal setup.

When you are ready to connect your data pipeline to a live broker, see the IBKR setup guide for a step-by-step walkthrough of connecting to Interactive Brokers with paper trading enabled by default.

Frequently Asked Questions

Do I need all 25 sources to start trading?

No. Most successful algorithmic strategies use 3–5 data sources. The value of this list is in helping you choose the right combination for your specific strategy and asset class. Start with one source from each category (market data, news, sentiment) and add more as you identify gaps in your signal pipeline. Adding more sources only helps if they provide genuinely independent information.

Are free data sources reliable enough for live trading?

For daily and swing trading strategies, yes. Yahoo Finance, SEC EDGAR, FRED, and CoinGecko are all production-grade sources used by institutional and retail traders alike. The main risk with free sources is availability: endpoints can change, rate limits can tighten, and services can experience downtime. Mitigate this by building redundancy into your pipeline (e.g., use Alpha Vantage as a fallback for Yahoo Finance) and implementing data quality checks that detect stale or anomalous data before it reaches your signal model.

How do I handle rate limits?

Three techniques work for every source on this list. First, cache aggressively — most fundamental and economic data does not change intraday, so there is no reason to fetch it more than once per day. Second, batch requests where the API supports it (the BLS v2 API, for example, accepts up to 50 series in a single call). Third, stagger your polling intervals so that all sources do not fire simultaneously. If you are hitting rate limits with these practices in place, you are likely polling too frequently for your strategy's actual needs.

Which sources work best together?

The strongest combinations pair a quantitative source with a qualitative one. For US equities: Yahoo Finance (price/technicals) + SEC EDGAR (insider transactions) + StockTwits (sentiment). For crypto: CoinGecko (price) + DeFi Llama (on-chain fundamentals) + Google Trends (attention). For macro strategies: FRED (economic data) + Treasury Direct (yield curve) + Polymarket (forward-looking event probabilities). The key is signal independence — sources that capture different types of information produce better ensemble predictions than sources that measure the same thing twice.