Delivery

How to access the Autobound Signal Database via Google Cloud Storage.

We provision read access to Google Cloud Storage (GCS) buckets. You authenticate with a service account we provide and pull data on your own schedule.

Available Buckets

Each signal type has its own dedicated bucket. Access each bucket directly by its URI.

SEC Filings

| Bucket | Description |
| --- | --- |
| gs://autobound-10k-v1/ | SEC 10-K annual filings |
| gs://autobound-10q-v1/ | SEC 10-Q quarterly filings |
| gs://autobound-20f-v2/ | SEC 20-F foreign company filings |
| gs://autobound-6k-v2/ | SEC 6-K foreign company reports |
| gs://autobound-8k/ | SEC 8-K current reports |
| gs://autobound-earnings-transcripts-v2/ | Earnings call transcripts |

Social & Web Signals

| Bucket | Description |
| --- | --- |
| gs://autobound-linkedin-post-company-v2/ | LinkedIn posts (company-level) |
| gs://autobound-linkedin-post-contact-v3/ | LinkedIn posts (contact-level) |
| gs://autobound-linkedin-comments-contact-v1/ | LinkedIn comments (contact-level) |
| gs://autobound-glassdoor-company-v2/ | Glassdoor company reviews |
| gs://autobound-reddit-company-v2/ | Reddit mentions (company-level) |
| gs://autobound-twitter-company-posts/ | Twitter/X posts (company-level) |
| gs://autobound-twitter-contact-posts/ | Twitter/X posts (contact-level) |
| gs://autobound-youtube-company/ | YouTube activity (company-level) |
| gs://autobound-youtube-contact/ | YouTube activity (contact-level) |

Company Intelligence

| Bucket | Description |
| --- | --- |
| gs://autobound-news-v3/ | News signals |
| gs://autobound-hiring-trends/ | Hiring trends |
| gs://autobound-hiring-velocity-v1/ | Hiring velocity |
| gs://autobound-employee-growth-v1/ | Employee growth signals |
| gs://autobound-github-v1/ | GitHub activity |
| gs://autobound-product-reviews-v1/ | Product reviews (G2) |
| gs://autobound-patents/ | Patent filings |
| gs://autobound-seo-traffic/ | SEO & traffic signals |
| gs://autobound-website-intelligence-v1/ | Website intelligence |
| gs://autobound-work-milestones-v2/ | Work milestones |
| gs://autobound-financials/ | Financial data |
| gs://autobound-tech-used/ | Technology stack |
| gs://autobound-intent/ | Intent signals |

Reference Data

| Bucket | Description |
| --- | --- |
| gs://autobound-company-database/ | Company database |
| gs://autobound-contact-database/ | Contact database |
| gs://autobound-manifests/ | Data manifests |

Bucket Structure

Each bucket contains timestamped folders. Each folder has two files: output.jsonl and output.parquet.

gs://autobound-news-v3/
├── 2026-01-31-17-30-00/
│   ├── output.jsonl
│   └── output.parquet
├── 2025-12-31-17-30-00/
│   ├── output.jsonl
│   └── output.parquet
└── ...

Pull from the most recent folder to get the latest data.
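
Because folder names are zero-padded YYYY-MM-DD-HH-MM-SS timestamps, lexicographic order matches chronological order, so the newest folder is simply the lexicographic maximum of the listing. A minimal sketch (the folder names below are illustrative, mirroring the tree above):

```python
# Folder names as returned by a bucket listing, e.g. from
# `gsutil ls gs://autobound-news-v3/` or the google-cloud-storage client.
folders = [
    "2025-12-31-17-30-00/",
    "2026-01-31-17-30-00/",
]

# Zero-padded timestamps sort lexicographically in chronological order,
# so the latest delivery is just the maximum name.
latest = max(folders)
print(latest)  # -> 2026-01-31-17-30-00/
```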

Authentication

  1. We provide you with a GCP service account JSON key file
  2. Set the environment variable: export GOOGLE_APPLICATION_CREDENTIALS="/path/to/key.json"
  3. Use gsutil to access the buckets
Example: Listing and downloading files
# List folders in a bucket (access each bucket directly by URI)
gsutil ls gs://autobound-news-v3/

# Download JSONL
gsutil cp gs://autobound-news-v3/2026-01-31-17-30-00/output.jsonl ./

# Download Parquet
gsutil cp gs://autobound-news-v3/2026-01-31-17-30-00/output.parquet ./
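
The same downloads can be scripted with the official google-cloud-storage Python client instead of gsutil. A sketch, assuming the library is installed (`pip install google-cloud-storage`) and GOOGLE_APPLICATION_CREDENTIALS points at the provided key; the helper name is ours, not part of the product:

```python
# Hypothetical helper around the google-cloud-storage client; requires
# `pip install google-cloud-storage` and GOOGLE_APPLICATION_CREDENTIALS.
try:
    from google.cloud import storage
except ImportError:
    storage = None  # library not installed; the gsutil examples above still apply

def download_file(bucket_name: str, blob_path: str, dest: str) -> None:
    """Download one delivery file, e.g.
    download_file("autobound-news-v3",
                  "2026-01-31-17-30-00/output.jsonl",
                  "./output.jsonl")
    """
    client = storage.Client()  # reads GOOGLE_APPLICATION_CREDENTIALS
    client.bucket(bucket_name).blob(blob_path).download_to_filename(dest)
```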

Tip: You can discover all available signal buckets by running:

gsutil ls -p autobound-signal-delivery

This returns the names of all signal buckets in the project — including buckets you are not licensed for. You can browse bucket names and folder timestamps freely, but accessing the data inside a bucket (reading files) requires objectViewer on that specific bucket, which is granted during onboarding. Unlicensed buckets will return AccessDeniedException: 403 when you attempt to read their contents.


File Formats

Both formats contain the same data. Choose based on your pipeline.

| Format | File | Best For |
| --- | --- | --- |
| JSONL | output.jsonl | Streaming ingestion, debugging, simple parsing |
| Parquet | output.parquet | Data warehouses, analytics, large-scale processing |
JSONL example

One signal per line:

{"signal_id":"7dfdb4b4-c0b4-4620-aca6-e7263123028e","signal_type":"news","detected_at":"2026-01-15T10:30:00Z","association":"company","company":{"name":"Acme Corp","domain":"acme.com"},"data":{"summary":"Acme Corp announces expansion into European markets...","source_url":"https://example.com/news/acme-expansion"}}
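
Each line is an independent JSON object, so a downloaded file can be parsed line by line with the standard library. A small sketch using a record shaped like the example above (fields truncated):

```python
import json

# One JSONL line, shaped like the news example above.
line = (
    '{"signal_id":"7dfdb4b4-c0b4-4620-aca6-e7263123028e",'
    '"signal_type":"news","detected_at":"2026-01-15T10:30:00Z",'
    '"association":"company",'
    '"company":{"name":"Acme Corp","domain":"acme.com"},'
    '"data":{"summary":"Acme Corp announces expansion..."}}'
)

signal = json.loads(line)
print(signal["company"]["domain"], signal["signal_type"])  # -> acme.com news
```

With a full file, iterate over it (`for line in open("output.jsonl")`) and call json.loads once per line.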
Parquet schema
signal_id: STRING
signal_type: STRING
signal_subtype: STRING
detected_at: TIMESTAMP
association: STRING
company: STRUCT<name, domain, linkedin_url, industries, employee_count_low, employee_count_high, description>
contact: STRUCT<first_name, last_name, name, email, job_title, linkedin_url, city, state, country>
data: STRING (JSON-encoded)

The data field is JSON-encoded to support varying fields across signal types.
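
Because the Parquet data column is a JSON string rather than a typed struct, decode it once per row after reading the file. A sketch with a sample column value (field names mirror the JSONL example above):

```python
import json

# Value of the `data` column for one Parquet row: a JSON-encoded string.
raw = ('{"summary":"Acme Corp announces expansion into European markets...",'
       '"source_url":"https://example.com/news/acme-expansion"}')

data = json.loads(raw)  # decode per row to get signal-type-specific fields
print(data["source_url"])  # -> https://example.com/news/acme-expansion
```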


Refresh Cadence

📘 February 2026 Update: SEC filings, earnings transcripts, news, hiring trends, hiring velocity, and work milestones now deliver weekly (shifted from monthly in February 2026). See the Changelog for full details.

| Category | Frequency | Last Delivered | Next Delivery |
| --- | --- | --- | --- |
| SEC Filings (10-K, 10-Q, 8-K, 6-K) | Weekly | Mar 31, 2026 | Apr 7, 2026 |
| SEC 20-F Foreign Filings | Weekly | Mar 31, 2026 | Apr 7, 2026 |
| Earnings Transcripts | Weekly | Mar 31, 2026 | Apr 7, 2026 |
| News | Weekly | Mar 31, 2026 | Apr 7, 2026 |
| Hiring Trends | Weekly | Mar 31, 2026 | Apr 7, 2026 |
| Hiring Velocity | Weekly | Mar 31, 2026 | Apr 7, 2026 |
| Work Milestones | Weekly | Mar 31, 2026 | Apr 7, 2026 |
| Patent Filings | Monthly | Mar 3, 2026 | Apr 3, 2026 |
| Website Intelligence | Monthly | Mar 4, 2026 | Apr 3, 2026 |
| LinkedIn Posts (Company) | Monthly | Mar 16, 2026 | Apr 13, 2026 |
| LinkedIn Posts (Contact) | Bi-weekly | Mar 30, 2026 | Apr 13, 2026 |
| LinkedIn Comments (Contact) | Monthly | Mar 16, 2026 | Apr 13, 2026 |
| Glassdoor Reviews | Monthly | Mar 13, 2026 | Apr 13, 2026 |
| Reddit Mentions | Monthly | Mar 24, 2026 | Apr 24, 2026 |
| GitHub Activity | Monthly | Mar 13, 2026 | Apr 13, 2026 |
| Product Reviews (G2) | Monthly | Mar 13, 2026 | Apr 13, 2026 |
| SEO & Traffic | Monthly | Mar 25, 2026 | Apr 25, 2026 |
| Twitter/X Posts (Company) | Monthly | Mar 13, 2026 | Apr 13, 2026 |
| Twitter/X Posts (Contact) | Monthly | Mar 13, 2026 | Apr 13, 2026 |
| YouTube Activity (Company) | Monthly | Mar 13, 2026 | Apr 13, 2026 |
| YouTube Activity (Contact) | Monthly | Mar 13, 2026 | Apr 13, 2026 |
| Employee Growth | Quarterly | Jan 7, 2026 | Apr 7, 2026 |
| Financial Fundamentals | Bi-weekly | Mar 31, 2026 | Apr 14, 2026 |

Getting Started

  1. Contact [email protected] to get your service account credentials
  2. Choose which signal types you need
  3. Set up authentication and start pulling data