Talk:Graphics processing unit
This is the talk page for discussing improvements to the Graphics processing unit article. This is not a forum for general discussion of the article's subject.
Archives: Index, 1 · Auto-archiving period: 3 months
This level-5 vital article is rated B-class on Wikipedia's content assessment scale.
Bitcoin & prime numbers & factoring
Q: Is there any benefit from mentioning non-traditional industrial applications for GPUs? (a) calculating prime numbers for use in factoring (b) Bitcoin mining (c) other --Howard from NYC (talk) 16:01, 7 February 2022 (UTC)
- A: That should already be adequately covered by the GPGPU section and the linked main article, or could perhaps be added there. --Zac67 (talk) 16:10, 7 February 2022 (UTC)
Is this supposed to be about GPUs or the companies that develop them?
Why do so many tech articles end up being about the companies that develop the tech, rather than the tech itself?
I think most people come to this page to learn about GPUs. I'd like to see a simple example of how they take an input, process an algorithm in parallel and produce an output that can be displayed on a screen (something along the lines of the sketch below).
Perhaps the article contains this, but if so it's buried in hundreds of lines of historical references and lots of "who did what first" for companies like Sony, Toshiba, Nvidia, ATI, Midway, Taito, Namco, Centuri, Gremlin, Irem, Konami, Nichibutsu, Sega and more.
One could write a bot to write these articles and program it with a few basic patterns, e.g. X1 built a Y1 that had Z1. Then X2 built a derivative called Y2 that had Z2. Meanwhile X3 built a totally different Y3 that had Z3. Ultimately X2 acquired both X1 and X3. It's kind of silly.
174.208.166.133 (talk) 16:49, 23 August 2022 (UTC)
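As a rough illustration of the kind of example requested above (a minimal sketch only; the brighten kernel, the 1920x1080 frame size, the test pattern and the +40 brightness delta are all made up for illustration), here is a CUDA snippet in which each GPU thread processes exactly one pixel of an input buffer, so the whole frame is transformed in parallel and the result could then be handed off to a display API:

#include <cuda_runtime.h>
#include <cstdio>

// Each GPU thread brightens exactly one pixel; many thousands of threads run at once.
__global__ void brighten(const unsigned char *in, unsigned char *out,
                         int width, int height, int delta)
{
    int x = blockIdx.x * blockDim.x + threadIdx.x;   // this thread's pixel column
    int y = blockIdx.y * blockDim.y + threadIdx.y;   // this thread's pixel row
    if (x >= width || y >= height) return;           // ignore threads outside the image
    int v = in[y * width + x] + delta;
    out[y * width + x] = v > 255 ? 255 : v;          // clamp to the 8-bit range
}

int main()
{
    const int W = 1920, H = 1080;                    // hypothetical frame size
    unsigned char *in, *out;
    cudaMallocManaged(&in,  W * H);                  // unified memory visible to CPU and GPU
    cudaMallocManaged(&out, W * H);
    for (int i = 0; i < W * H; ++i) in[i] = i % 256; // dummy greyscale test pattern

    dim3 block(16, 16);                              // 256 threads per block
    dim3 grid((W + 15) / 16, (H + 15) / 16);         // enough blocks to cover the frame
    brighten<<<grid, block>>>(in, out, W, H, 40);    // launch: one thread per pixel
    cudaDeviceSynchronize();                         // wait for the GPU to finish

    printf("first output pixel: %d\n", out[0]);
    cudaFree(in); cudaFree(out);
    return 0;
}

In a real graphics pipeline the per-pixel work would be expressed as a shader rather than a CUDA kernel, but the one-thread-per-element structure is the same parallel pattern behind the stream-processing/GPGPU material mentioned elsewhere in this discussion.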
- This article covers different aspects of GPUs, with some emphasis on their history and stages of development, which is what other readers have been interested in. I think what you're looking for might be found in Graphics processing unit#Stream processing and general purpose GPUs (GPGPU) and the articles linked in that section. Also note that Wikipedia is an encyclopedia and not a how-to guide. --Zac67 (talk) 17:19, 23 August 2022 (UTC)
SGI, GPU and OpenGL
Nice, maybe add a little bit more about SGI, their GPUs and OpenGL. 2A04:4540:6A28:CF00:5DA9:C4D3:C033:13C3 (talk) 15:53, 9 February 2023 (UTC)
Intro seems odd to single out ChatGPT
The part about GPT and ChatGPT seems out of place in a standard Wikipedia intro. It should probably link to AI/ML in general and the LLM article rather than a specific LLM. I didn't want to remove it outright, as I wanted opinions first, but it seems very off to me. Greenking2000 (talk) 09:47, 6 July 2023 (UTC)
- You are right. I got carried away. I should have stopped at mentioning LLMs, as you suggested. Feel free to remove it. DancingPhilosopher (talk) 10:26, 6 July 2023 (UTC)
- Completely agree. I've removed the unsourced claim (since February) and the undue weight. --Zac67 (talk) 11:29, 6 July 2023 (UTC)
- I've edited your edit to remove the LLM mention entirely and just mention AI/ML training, and added crypto mining, as that is another thing GPUs are good at.
- I don't think we need to mention specific AI/ML types, as they are all trained with GPUs. Greenking2000 (talk) 14:34, 8 July 2023 (UTC)
Worth noting recent potential security issues related to GPU architectures?
Read this article; it seems like an interesting note regarding how GPUs have been optimizing image processing.