Hacker News

It was inspired by these authors, not created by them. I won't claim where copyright/authorship should lie, but this reduction is too sloppy for definitions this important.



Software does not get inspired


Now it does.


Inspiration implies some intrinsic creativity that gets an impulse from someone else's work, but is not determined by it. We don't know how creativity and inspiration work in humans, but we know for sure that the response of a generative model is 100% determined by its weights, which in turn are determined by the data it consumed during training, and the prompt.
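Since the thread hinges on this determinism claim, here is a minimal sketch of it. `toy_generate` is a made-up, hash-based stand-in for a generative model, not a real one: with the weights, prompt, and sampling seed fixed, the output is bit-for-bit reproducible, and nothing else influences it.

```python
import hashlib
import random


def toy_generate(weights: bytes, prompt: str, seed: int, n_tokens: int = 5) -> list[int]:
    """Deterministic stand-in for a generative model: the output depends
    only on the weights, the prompt, and the sampling seed."""
    rng = random.Random(seed)  # seeded RNG: "randomness" is reproducible
    state = hashlib.sha256(weights + prompt.encode()).digest()
    tokens = []
    for _ in range(n_tokens):
        # Derive toy "logits" from the current state, then sample with the seeded RNG.
        logits = list(state[:8])
        token = rng.choices(range(8), weights=[l + 1 for l in logits])[0]
        tokens.append(token)
        # Fold the chosen token back into the state, like autoregressive decoding.
        state = hashlib.sha256(state + bytes([token])).digest()
    return tokens


weights = b"trained-on-some-corpus"
a = toy_generate(weights, "hello", seed=42)
b = toy_generate(weights, "hello", seed=42)
assert a == b  # same weights, prompt, and seed: identical output every run
```

Real systems add temperature and sampling noise, but that noise also comes from a seed; the point the comment makes is that there is no input to the function other than weights, prompt, and seed.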


So what. Why do you think that determinism should play any role here?


2 + 2 = 4

Nothing inspired or creative about it. It comes out the same every time, no matter who calculates it. That's what people want to copyright, except the first 2 is other people's art and the second 2 is the weights.

Why do you think it doesn’t matter?


Because it's not a process of copying.


Determinism or randomness should play no role here. The fact that the output is a function of model, training data and prompt (and nothing else) makes the result a derived work from the training data.


By stretching "derived" that far you inevitably cover pre-AI human inspirations. It doesn't make sense.


It makes sense if you don’t think of humans as tools and think of art consumption pre-AI. Very little would be made public if people thought their work would get scooped up to create these models.

I, and others, don't care if the process of observing and generating output works the same way, because people and tools don't need to be held to the same standard. As I've said numerous times elsewhere, big picture, AI and humans are different in practically every other way, and that is seemingly never taken into consideration when promoting AI adoption. It's also a big leap to say we fully understand how human creativity works, which is still under study.

You may believe the opposite (I could be wrong) because it means you can benefit directly from these tools, but other people view it through the lens of what they or others may lose, and that is no less rational. Social constructs require social consensus. If you got rid of capitalism, people might be more open to your viewpoint, but as it stands this just smells like socializing human creativity for free, in a way people could never have anticipated, to make other people money, no doubt consolidating more power in the corporations who can afford to run the models at scale, as is standard.

I suppose we’re unlikely to agree so I’ll leave it at this.


AI is always human until it isn't. It's not copyright infringement because the AI learned from its input like a human, but it's also just a tool that can't hold copyright over its own output. It's a moral imperative that someone ought to make money off that. /s


Most importantly, it can't be held liable like a human can.





