comrade_pibb [comrade/them]@hexbear.net to chapotraphouse@hexbear.net · 2 days ago
yeah i read theory (hexbear.net)
prole [any, any]@hexbear.net · 2 days ago
Why do you think LLMs use a lot of em dashes, if not because the data they are trained on uses a lot of em dashes? And if the data they are trained on uses a lot of em dashes, wouldn't you expect to see a lot of em dashes in articles?