SafeSearch Vision moderation API for detecting child sexual abuse material

Good day @jasonsaeho,

Welcome to Google Cloud Community!

I’ll try to answer your questions:

  1. Since the image contains adult sexual content, Google SafeSearch should be able to classify it as adult (VERY_LIKELY).

  2. If it does not contain adult sexual content but does contain sexual acts, it may be classified as racy (LIKELY).

  3. This may also be classified under the adult category (VERY_LIKELY).

  4. Inappropriate poses may fall under the racy category (LIKELY).

Please keep in mind that interpretations of what constitutes an unacceptable pose or act may vary, so there may be times when additional steps are required, such as applying your own threshold to the returned likelihoods, as in the sketch below.
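
For illustration, here is a minimal sketch in Python (assuming the google-cloud-vision client library) of how you might apply such a threshold; the cutoff of LIKELY and the function name are assumptions you would adapt to your own use case:

```python
from google.cloud import vision

# Likelihood values form an ordered enum, so they can be compared directly.
# Flagging at LIKELY and above is an assumption; tune the threshold to your
# own tolerance for false positives vs. false negatives.
FLAG_AT = vision.Likelihood.LIKELY

def should_flag(annotation: vision.SafeSearchAnnotation) -> bool:
    """Flag an image when its adult or racy likelihood meets the threshold."""
    return annotation.adult >= FLAG_AT or annotation.racy >= FLAG_AT
```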

You can also check this blog post, which discusses how SafeSearch classification works: https://cloud.google.com/blog/products/ai-machine-learning/filtering-inappropriate-content-with-the-cloud-vision-api

You can also try performing SafeSearch detection on a local file; you can use this link to learn more: https://cloud.google.com/vision/docs/samples/vision-safe-search-detection
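
As a quick illustration, here is a minimal sketch based on that sample, assuming the google-cloud-vision Python client and Application Default Credentials; "photo.jpg" is a placeholder path:

```python
import io

from google.cloud import vision

# Run SafeSearch detection on a local image file.
client = vision.ImageAnnotatorClient()

with io.open("photo.jpg", "rb") as image_file:
    content = image_file.read()

image = vision.Image(content=content)
response = client.safe_search_detection(image=image)
safe = response.safe_search_annotation

# Each field is a Likelihood enum; .name gives e.g. "VERY_LIKELY".
print(f"adult: {safe.adult.name}")
print(f"racy: {safe.racy.name}")
```

You can then feed the returned annotation into a threshold check like the one sketched earlier.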

Alternatively, you can also train a custom model that classifies these images based on your own objectives. You can check this link for more information: https://cloud.google.com/vertex-ai/docs/training-overview
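
As a rough sketch of that route, assuming the google-cloud-aiplatform SDK and a CSV of labeled image URIs in Cloud Storage (all names below are placeholders to replace with your own):

```python
from google.cloud import aiplatform

# Placeholders: supply your own project, region, bucket, and labels CSV.
aiplatform.init(project="your-project", location="us-central1")

# Create an image dataset from labeled images listed in a GCS CSV.
dataset = aiplatform.ImageDataset.create(
    display_name="moderation-images",
    gcs_source="gs://your-bucket/labels.csv",
    import_schema_uri=aiplatform.schema.dataset.ioformat.image.single_label_classification,
)

# Train an AutoML single-label image classifier on that dataset.
job = aiplatform.AutoMLImageTrainingJob(
    display_name="moderation-classifier",
    prediction_type="classification",
)
model = job.run(
    dataset=dataset,
    model_display_name="moderation-model",
    budget_milli_node_hours=8000,  # 8 node hours, the minimum budget tier
)
```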

Hope this helps!
