class Google::Apis::VisionV1::GoogleCloudVisionV1p3beta1SafeSearchAnnotation

Set of features pertaining to the image, computed by computer vision methods over safe-search verticals (for example, adult, spoof, medical, violence).

Attributes

adult[RW]

Represents the adult content likelihood for the image. Adult content may contain elements such as nudity, pornographic images or cartoons, or sexual activities. Corresponds to the JSON property `adult`. @return [String]

medical[RW]

Likelihood that this is a medical image. Corresponds to the JSON property `medical`. @return [String]

racy[RW]

Likelihood that the request image contains racy content. Racy content may include (but is not limited to) skimpy or sheer clothing, strategically covered nudity, lewd or provocative poses, or close-ups of sensitive body areas. Corresponds to the JSON property `racy`. @return [String]

spoof[RW]

Spoof likelihood. The likelihood that a modification was made to the image's canonical version to make it appear funny or offensive. Corresponds to the JSON property `spoof`. @return [String]

violence[RW]

Likelihood that this image contains violent content. Corresponds to the JSON property `violence`. @return [String]
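Each attribute above is a Likelihood value encoded as a String. Assuming the Cloud Vision API's standard Likelihood enum ordering (`UNKNOWN`, `VERY_UNLIKELY`, `UNLIKELY`, `POSSIBLE`, `LIKELY`, `VERY_LIKELY`), a caller might threshold the verticals like this; the `flagged_verticals` helper and the `SafeSearchStub` struct are illustrative stand-ins, not part of the gem:

```ruby
# Likelihood scale, ordered from least to most likely. UNKNOWN is
# placed lowest here by convention (an assumption for this sketch).
LIKELIHOOD_ORDER = %w[UNKNOWN VERY_UNLIKELY UNLIKELY POSSIBLE LIKELY VERY_LIKELY].freeze

# Returns the safe-search verticals whose likelihood is at or above
# the given threshold. `annotation` only needs to respond to the five
# attribute readers, so a Struct stands in for the real class here.
def flagged_verticals(annotation, threshold: 'LIKELY')
  min = LIKELIHOOD_ORDER.index(threshold)
  %i[adult medical racy spoof violence].select do |vertical|
    rank = LIKELIHOOD_ORDER.index(annotation.public_send(vertical))
    rank && rank >= min
  end
end

SafeSearchStub = Struct.new(:adult, :medical, :racy, :spoof, :violence)
annotation = SafeSearchStub.new('VERY_UNLIKELY', 'UNLIKELY', 'LIKELY', 'POSSIBLE', 'VERY_LIKELY')
flagged_verticals(annotation)  # => [:racy, :violence]
```

The same helper works with a real `GoogleCloudVisionV1p3beta1SafeSearchAnnotation` instance, since it exposes the same five readers.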

Public Class Methods

new(**args)
# File lib/google/apis/vision_v1/classes.rb, line 6107
def initialize(**args)
   update!(**args)
end

Public Instance Methods

update!(**args)

Update properties of this object

# File lib/google/apis/vision_v1/classes.rb, line 6112
def update!(**args)
  @adult = args[:adult] if args.key?(:adult)
  @medical = args[:medical] if args.key?(:medical)
  @racy = args[:racy] if args.key?(:racy)
  @spoof = args[:spoof] if args.key?(:spoof)
  @violence = args[:violence] if args.key?(:violence)
end
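Because each assignment is guarded by `args.key?`, `update!` only touches attributes whose keys were explicitly passed: an omitted key leaves the current value alone, while an explicit `nil` clears it. A minimal stand-in sketch of that pattern (not the real class, which lives in lib/google/apis/vision_v1/classes.rb):

```ruby
# Minimal stand-in mirroring the update! pattern above: only keys
# actually present in **args are assigned, so omitted attributes keep
# their current values while an explicit nil overwrites them.
class SafeSearchLike
  attr_accessor :adult, :racy

  def initialize(**args)
    update!(**args)
  end

  def update!(**args)
    @adult = args[:adult] if args.key?(:adult)
    @racy  = args[:racy]  if args.key?(:racy)
  end
end

a = SafeSearchLike.new(adult: 'UNLIKELY', racy: 'POSSIBLE')
a.update!(racy: 'LIKELY')   # :adult key omitted, so adult is untouched
a.update!(adult: nil)       # explicit nil clears adult
```

This partial-update semantics is why the generated classes use `args.key?` rather than `args[:adult]` truthiness checks, which would skip legitimate `nil` and `false` values.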