scrapedogg

v1.1.2

Welcome to ScrapeDogg 🐕


About

ScrapeDogg is an experimental library for scraping websites using OpenAI's GPT.

The library provides a means to scrape structured data from HTML without writing page-specific code.

⚠️ "Important" ⚠️

Before you proceed, here are at least two reasons why you should not use this library:

  • It is **very experimental**: no guarantees are made about the stability of the API or the accuracy of the results.

  • It relies on the OpenAI API, which is quite slow and can be expensive.

**Use at your own risk.**

Quickstart

Step 1) Obtain an OpenAI API key (https://platform.openai.com) and set it as an environment variable in a .env file:

OPENAI_API_KEY=sk-...
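
If your setup doesn't load .env files automatically, here is a minimal sketch using the dotenv package (an assumption on my part; scrapedogg may well read the file on its own):

// Load variables from .env into process.env before using the scraper
require("dotenv").config();

// Quick sanity check that the key is visible to the process
console.log(Boolean(process.env.OPENAI_API_KEY)); // true if the key is set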

Step 2) Install the library however you like:

npm install scrapedogg

or

yarn add scrapedogg

Step 3) Define a URL and a schema describing the data you wish to extract:

// The page to scrape, plus the fields to extract; each value is a hint
// describing the expected type/format of that field
let url = "https://books.toscrape.com/catalogue/a-light-in-the-attic_1000/index.html";
let schema = {
  title: "string",
  book_cover: "url",
  url: "url",
  price: "string",
  stock: "int",
  rating: "int/int stars",
  description: "string",
};

Step 4) Pass the URL and schema to scrape; the extracted data is returned and logged to the console:

// Assuming the package exposes scrape as a named export
const { scrape } = require("scrapedogg");

(async () => {
  console.log(await scrape(url, schema));
})();

Which logs:

{
    "title": "A Light in the Attic",
    "book_cover": "https://books.toscrape.com/media/cache/fe/72/fe72f0532301ec28892ae79a629a293c.jpg",
    "url": "https://books.toscrape.com/catalogue/a-light-in-the-attic_1000/index.html",
    "price": "£51.77",
    "stock": 22,
    "rating": "3/5 stars",
    "description": "It's hard to imagine a world without A Light in the Attic. This now-classic collection of poetry and drawings from Shel Silverstein celebrates its 20th anniversary with this special edition. ...more"
}

That's it! 🎉
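
Because the OpenAI API can be slow and occasionally fails (see the warning above), you may want some basic error handling around the call. A minimal sketch, assuming scrape rejects its promise when the request or the extraction fails:

(async () => {
  try {
    const data = await scrape(url, schema);
    console.log(data);
  } catch (err) {
    // Network errors, rate limits, or unparsable model output could all land here
    console.error("Scrape failed:", err.message);
  }
})();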

Prerequisites

This project requires Node.js (version 8 or later) and npm. To make sure both are available on your machine, run the following command:

$ npm -v && node -v
6.4.1
v8.16.0

Install

npm install scrapedogg

Run tests

npm run test

Contributing

  1. Fork it!
  2. Create your feature branch: git checkout -b my-new-feature
  3. Add your changes: git add .
  4. Commit your changes: git commit -am 'Add some feature'
  5. Push to the branch: git push origin my-new-feature
  6. Submit a pull request 😎

Credits

  • scrapeghost
    • Huge thanks to scrapeghost for inspiring me to build this idea in JS. Go check out their Python version!

Show your support

Give a ⭐️ if this project helped you!