In this workflow, you'll use Claygent to see which tech stack a company is using, a useful option if you want to conserve credits rather than spend them on the BuiltWith enrichment. This workflow will:
- Clean the company domain
- Generate the company's BuiltWith URL
- Use Claygent to determine whether a specific tool or tech stack is used by that company
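The first two steps above can be sketched in code. This is a minimal illustration, not Clay's internal formula: it assumes the raw input may carry a scheme, a `www.` prefix, or a path, and that BuiltWith's public profile pages live at `builtwith.com/<domain>`.

```python
from urllib.parse import urlparse

def clean_domain(raw: str) -> str:
    """Strip scheme, 'www.' prefix, path, and whitespace from a company URL."""
    raw = raw.strip().lower()
    if "//" not in raw:
        # urlparse only fills netloc when a scheme separator is present.
        raw = "//" + raw
    host = urlparse(raw).netloc
    return host.removeprefix("www.")

def builtwith_url(raw: str) -> str:
    """Build the public BuiltWith profile URL for a cleaned domain."""
    return f"https://builtwith.com/{clean_domain(raw)}"
```

The resulting URL is what you would hand to Claygent, asking it to check the page for a specific tool.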
For more detailed instructions, visit here.
This workflow uses Claygent to enrich school districts based on numerous data points. Specifically, you'll be able to find out:
- School district URL
- School district superintendent
- School district size
- Student-teacher ratio
- Average SAT score
- Average ACT score
- City location
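The data points above amount to one enrichment record per district. As a sketch, here is what that record might look like as a typed structure; the field names are illustrative and should be matched to your own Clay column names.

```python
from dataclasses import dataclass

@dataclass
class DistrictRecord:
    # Field names are illustrative; align them with your Clay columns.
    name: str
    url: str
    superintendent: str
    district_size: int          # total student enrollment
    student_teacher_ratio: float
    avg_sat: int
    avg_act: float
    city: str
```

Keeping the Claygent outputs in a consistent shape like this makes it easier to export or join against other tables later.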
This recipe creates a custom image for use in a campaign. The image is templatized just like an email, with the ability to set custom colors and add custom images. It uses a tool called Dynapictures and is a higher-effort play; done correctly, though, it can show real effort and outperform a plain-text campaign.
This first template shows how the HTTP API integration is set up to send pictures (in this case, the person's LinkedIn profile picture) to Zapier. From Zapier, it connects to Dynapictures to create the image, which is then sent to the second table.
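The HTTP API step above boils down to POSTing a small JSON body to a Zapier catch hook. Here is a minimal sketch: the hook URL is a placeholder you must replace with your own, and the payload keys are assumptions to be matched against what your Zap and Dynapictures template expect.

```python
import json
from urllib import request

# Placeholder; paste your own Zapier "Catch Hook" URL here.
ZAPIER_HOOK_URL = "https://hooks.zapier.com/hooks/catch/XXXX/YYYY/"

def build_zapier_payload(prospect: dict) -> bytes:
    """Assemble the JSON body the HTTP API column posts to the Zapier hook."""
    payload = {
        "first_name": prospect["first_name"],
        "company": prospect["company"],
        "profile_image_url": prospect["linkedin_photo_url"],
    }
    return json.dumps(payload).encode()

def send_to_zapier(prospect: dict) -> None:
    """POST the payload; the Zap forwards it to Dynapictures to render the image."""
    req = request.Request(
        ZAPIER_HOOK_URL,
        data=build_zapier_payload(prospect),
        headers={"Content-Type": "application/json"},
    )
    request.urlopen(req)
```

In Clay itself you would configure the same request in the HTTP API column rather than running a script, but the shape of the call is the same.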
To continue this play, read more on the second table in this sequence.
This workflow lets you scrape Indeed using Browse AI and send that data to Clay for enrichment every day, based on the criteria you set. You can extract different details depending on your outbound use case. To do this, you need to:
- Go to Browse AI and pick an Indeed scraping template.
- Copy Browse AI's webhook URL.
- Go to your Clay table and pull in data from a Webhook. The scraped data will write to another table where you can perform more enrichment.
- Use this template to find company job listings, job titles, job locations, salaries, and more.
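Once the webhook data lands, each Browse AI run needs to be mapped to your Clay columns. A minimal sketch of that mapping follows; the key names (`capturedLists`, `job_listings`, and the field names) are assumptions about the payload shape, so inspect a sample run from your chosen template before relying on them.

```python
def browseai_row_to_clay(task: dict) -> dict:
    """Map one Browse AI captured row to the columns a Clay webhook table expects.

    Key names here are illustrative; the actual payload depends on the
    Browse AI robot/template you picked.
    """
    captured = task.get("capturedLists", {}).get("job_listings", [{}])[0]
    return {
        "company": captured.get("company", ""),
        "job_title": captured.get("title", ""),
        "location": captured.get("location", ""),
        "salary": captured.get("salary", ""),
    }
```

Normalizing the payload this way keeps the downstream enrichment table's columns stable even if the scraper's output changes.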
In this workflow, you'll be able to:
- Scrape search results of affiliate websites using Serper
- Scrape affiliate websites for companies that are typically hard to generate a lead list from
- Compile them into JSON format to export to a different table with each scraped domain in a separate row
To run this play, you will need your own Serper.dev API Key.
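As a sketch of the Serper step and the export step above: Serper.dev accepts a JSON search query POSTed to its `/search` endpoint with your API key in the `X-API-KEY` header, and the organic results can then be flattened into one row per scraped domain. The helper names and output row shape below are illustrative.

```python
import json
from urllib import request
from urllib.parse import urlparse

SERPER_ENDPOINT = "https://google.serper.dev/search"

def serper_search(query: str, api_key: str) -> dict:
    """POST a search query to Serper.dev (requires your own API key)."""
    req = request.Request(
        SERPER_ENDPOINT,
        data=json.dumps({"q": query}).encode(),
        headers={"X-API-KEY": api_key, "Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.load(resp)

def domains_from_results(results: dict) -> list[dict]:
    """Flatten organic results into one-row-per-domain dicts for export."""
    rows, seen = [], set()
    for item in results.get("organic", []):
        domain = urlparse(item.get("link", "")).netloc.removeprefix("www.")
        if domain and domain not in seen:
            seen.add(domain)
            rows.append({"domain": domain, "title": item.get("title", "")})
    return rows
```

The list returned by `domains_from_results` matches the export shape described above: each scraped domain lands in its own row of the destination table.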