
react-native-nsfw

v0.0.2-alpha.0

Published

Provides a function to detect NSFW images on iOS using Core ML.

Downloads

7

Readme

react-native-nsfw

NSFWDetector is a small (17 kB) Core ML model that scans images for nudity. It was trained using Create ML to distinguish between porn/nudity and appropriate pictures, with the main focus on telling Instagram-model-style pictures apart from porn.

This package is a React Native wrapper around NSFWDetector; I built it to learn how to work with Expo's Sweet API. I would highly appreciate contributions for Android and Web (there are already some very good JS libs). It's even possible to implement this for the Camera.

Expo

Even though this project uses expo-modules-core, it does not work with Expo Go, since it adds native code and is not part of the Expo ecosystem (nor related to it).

Installation in React Native projects / Expo Custom Dev Client

You must ensure that you have installed and configured the expo package before continuing. This is expected to take about five minutes. The footprint is very small, and the modules benefit from JSI under the hood. If you're already using Expo with a custom dev client (good choice), you can skip this step.

Add the package to your npm dependencies

yarn add react-native-nsfw

Configure for iOS

Run npx pod-install after installing the npm package.

Configure for Android

Not supported yet. Even a no-op implementation is currently missing, but will be added soon.

API documentation

import * as NSFWDetector from "react-native-nsfw";

const { isNSFW, confidence } = await NSFWDetector.detectAsync(uri, threshold);

Supports all image types that expo-image-loader can resolve (which is basically everything you need).

| Argument | Description |
| --- | --- |
| `uri` (string) | URI of the file to classify. Must be on the local file system or a base64 data URI. Remote files are not supported yet. |
| `threshold?` (number) | Confidence at which an image should be classified as NSFW. Range between 0.0 and 1.0. Defaults to 0.9. Optional. |
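As a sketch of the threshold semantics documented above: an image counts as NSFW when the model's confidence meets the threshold, which defaults to 0.9. The `NSFWResult` shape and the `classify` helper below are illustrative assumptions for clarity, not part of the package API; the real classification happens natively in `detectAsync`.

```typescript
// Illustrative result shape, mirroring what detectAsync resolves to.
interface NSFWResult {
  isNSFW: boolean;
  confidence: number;
}

// Hypothetical helper showing the documented threshold behavior:
// threshold defaults to 0.9 and must lie within [0.0, 1.0].
function classify(confidence: number, threshold: number = 0.9): NSFWResult {
  if (threshold < 0.0 || threshold > 1.0) {
    throw new RangeError("threshold must be between 0.0 and 1.0");
  }
  return { isNSFW: confidence >= threshold, confidence };
}
```

In an app, you would instead call `await NSFWDetector.detectAsync(uri, threshold)` as shown above and receive the same `isNSFW`/`confidence` pair from the native Core ML model.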

License

MIT