WhatsApp’s AI shows gun-wielding children when prompted with ‘Palestine’::By contrast, prompts for ‘Israeli’ do not generate images of people wielding guns, even in response to a prompt for ‘Israel army’

  • @vacuumflower
    8 months ago

    I’d like to point out that not everything generative is machine learning, so prejudices in training datasets do not affect every generative system.

    That’s off topic, but I’m just playing with generative music now. I started with SuperCollider, but it was too hard (maybe not anymore, to be fair; recycling a phrase, for example, would probably be much easier and faster there than in my spaghetti shell script), so now I just generate ABC notation, convert it to MIDI with various instruments, and render it with FluidSynth.
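    A minimal sketch of that pipeline, assuming `abc2midi` (from the abcMIDI package) and the `fluidsynth` CLI are installed; the random-note scheme and the soundfont path are my own illustrative assumptions, not the commenter’s actual script:

    ```shell
    #!/bin/sh
    # Sketch: generate a tiny ABC tune, convert it to MIDI, render to audio.

    # 1. Write 16 random notes from the C major scale as an ABC file.
    {
      printf 'X:1\nT:Generated tune\nM:4/4\nL:1/8\nK:C\n'
      for i in 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16; do
        # Pick an index 1..8 from two random bytes.
        n=$(( ( $(od -An -N2 -tu2 /dev/urandom) % 8 ) + 1 ))
        set -- C D E F G A B c
        shift $(( n - 1 ))
        printf '%s ' "$1"
      done
      printf '|]\n'
    } > tune.abc

    # 2. ABC -> MIDI (skipped gracefully if abc2midi is not installed).
    if command -v abc2midi >/dev/null 2>&1; then
      abc2midi tune.abc -o tune.mid
    fi

    # 3. MIDI -> WAV, rendered offline with a General MIDI soundfont
    #    (the .sf2 path varies per system; adjust as needed).
    if command -v fluidsynth >/dev/null 2>&1 && [ -f tune.mid ]; then
      fluidsynth -ni /usr/share/sounds/sf2/FluidR3_GM.sf2 tune.mid -F tune.wav
    fi
    ```

    The offline `-F` render avoids needing an audio device, which is handy when batch-generating many tunes.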