Experimenting with ingesting SPI data into GPT via a plugin #2369
I'm philosophically opposed - but my opposition isn't to the possible value, it's to the idea (perhaps incorrect in my head) that this kind of effort is extremely value-extractive from the community, with comparatively little returned to the people providing the data. Part of this comes from the stance that all these LLMs have been trained on public data with, I'm expecting, fairly minimal curation. This is in effect doing that fairly expensive curation work FOR OpenAI, and I see little incentive for them to provide compensation back for that effort.

On a related note, from a pure cost perspective, I suspect an LLM-based search (where you end up paying for each search query) is prohibitively expensive. By comparison, if you can summarize-and-store on each change (#2368), that's likely a far more cost-effective mechanism to leverage the capability for search without driving up the cost of providing that search.
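To make that cost argument concrete, here's a very rough Swift sketch of what I mean by summarize-and-store. Everything in it is hypothetical - `llmSummary(for:)` stands in for whichever paid summarisation call would be used, and `SummaryStore` for wherever the results end up - but the point is that you pay for one LLM call per package change, and every search after that is a plain text query against stored summaries with no per-query LLM cost.

```swift
import Foundation

// Hypothetical stand-ins - these aren't SPI's actual models.
struct Package {
    let id: String
    let readme: String
}

// One paid LLM call, made once per package *change*, not per search.
// Placeholder for whichever summarisation API ends up being used.
func llmSummary(for package: Package) async throws -> String {
    // ...call out to the LLM provider here...
    return "Summary of \(package.id)"
}

actor SummaryStore {
    private var summaries: [String: String] = [:]  // package id -> summary

    // Called from the ingestion pipeline whenever a package changes.
    func packageDidChange(_ package: Package) async throws {
        summaries[package.id] = try await llmSummary(for: package)
    }

    // Search is a plain text match over the stored summaries:
    // no LLM in the loop, so each query costs next to nothing.
    func search(_ query: String) -> [String] {
        summaries
            .filter { $0.value.localizedCaseInsensitiveContains(query) }
            .map { $0.key }
    }
}
```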
In addition to #2368, I think there’s another potentially interesting use case for GPT in the Swift Package Index, with plugins.
As I understand them, you can feed up-to-date data into GPT so that it can then be queried, either via the ChatGPT tool or via the API. I believe this would open up some interesting possibilities.

This is NOWHERE NEAR “New Issue” stage yet, but it’s interesting enough that I’d like to investigate some possibilities, and I’ll use this thread for updates.
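For the record, my current understanding of what a plugin needs from our side is: a manifest served from a well-known URL, pointing at an OpenAPI spec that describes an endpoint the model can call (for us, presumably something like package search). Here's a minimal Swift sketch of those shapes. The manifest field names are my reading of OpenAI's plugin manifest format and may not be exact, and all of the SPI-side names, URLs, and the search endpoint itself are made up for illustration.

```swift
import Foundation

// Shape of the `ai-plugin.json` manifest a plugin serves from /.well-known/.
// Field names follow my reading of OpenAI's plugin manifest format; fields
// like auth, logo_url, and contact_email are omitted here for brevity.
struct PluginManifest: Codable {
    var schemaVersion = "v1"
    var nameForHuman = "Swift Package Index"
    var nameForModel = "swift_package_index"
    var descriptionForHuman = "Search Swift packages."
    var descriptionForModel = "Search the Swift Package Index for packages matching a query."
    var api = API()

    struct API: Codable {
        var type = "openapi"
        // Hypothetical URL: the OpenAPI spec describing the search endpoint below.
        var url = "https://swiftpackageindex.com/openapi.yaml"
    }
}

// Shape of the response a (hypothetical) /api/search endpoint would return.
// This is the data the model ultimately sees and reasons about.
struct SearchResult: Codable {
    var packageName: String
    var repositoryURL: String
    var summary: String
}

// The manifest format uses snake_case keys, so encode accordingly.
let encoder = JSONEncoder()
encoder.keyEncodingStrategy = .convertToSnakeCase
encoder.outputFormatting = [.prettyPrinted, .sortedKeys]
let json = try encoder.encode(PluginManifest())
print(String(decoding: json, as: UTF8.self))
```

Running that just prints the JSON the manifest would serialise to; the real work would obviously be the search endpoint and the OpenAPI spec behind it.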