User:16dvnk
| This user is a WikiDragon: making massive, bold edits everywhere. |
| This user is a member of the Technology WikiProject. |
| This user is a Reichstag climbing patroller. |

Self Introduction
16dvnk is my username as an independent developer and thinker working at the intersection of artificial intelligence and political logic. My focus is on building small language models that combine precision with emergent capabilities.
Work
I design and train my own models under the name AaI, aiming for clean, interpretable behavior with minimal hallucination. My current release is:
- AaI mini+ alpha+ 0729
A self-trained 14M-parameter model with decent stopping behavior and basic reasoning, designed for controlled output, clarity, and logical discipline even at small scale. It was trained with a Python library on an Nvidia GeForce RTX 4080 GPU; a rough sketch of what a model at this scale looks like follows.
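For scale, here is a minimal sketch of a causal language model in this parameter range, written in PyTorch. The vocabulary size, width, and depth are my own illustrative guesses chosen to land near 14M parameters, not the actual AaI architecture:

```python
import torch
import torch.nn as nn

class TinyLM(nn.Module):
    """A small decoder-only language model, sized to roughly 14M parameters."""
    def __init__(self, vocab_size=8000, d_model=320, n_layers=7,
                 n_heads=8, max_len=512):
        super().__init__()
        self.tok = nn.Embedding(vocab_size, d_model)   # token embeddings
        self.pos = nn.Embedding(max_len, d_model)      # learned positions
        block = nn.TransformerEncoderLayer(
            d_model, n_heads, dim_feedforward=4 * d_model,
            batch_first=True, norm_first=True)
        self.blocks = nn.TransformerEncoder(block, n_layers)
        self.head = nn.Linear(d_model, vocab_size, bias=False)

    def forward(self, ids):
        t = ids.size(1)
        # Causal mask: each position may only attend to itself and the past.
        mask = torch.triu(
            torch.full((t, t), float("-inf"), device=ids.device), diagonal=1)
        x = self.tok(ids) + self.pos(torch.arange(t, device=ids.device))
        return self.head(self.blocks(x, mask=mask))

device = "cuda" if torch.cuda.is_available() else "cpu"  # e.g. an RTX 4080
model = TinyLM().to(device)
print(f"{sum(p.numel() for p in model.parameters()) / 1e6:.1f}M parameters")
```

With these sizes the print reports about 13.9M parameters; most of the budget sits in the seven transformer blocks, with the embedding and output matrices taking the rest.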
Articles I've created
Below are the articles I have created, ordered from first made to last made:
User pages
Wiki pages
- Knowledge cutoff (currently C tier)
Philosophy
I approach AI with a layered view of cognition:
- Awareness → Consciousness → Understanding → Truth → Wisdom (a progression generated by my own model)
This framework guides how I structure training, evaluation, and behavior modeling. I value strategic minimalism — fewer parameters, more insight.
Forecasting
Using ensemble techniques and original logic models, I've projected trends in economics and politics. For example:
- 2028 U.S. election forecast (outdated)
Harris/Shapiro: 303 EV; Vance/Haley: 235 EV
- Long-term projection (based on only brief research)
China is likely to surpass U.S. GDP by ~2035, based on compound modeling and growth asymmetry (see the sketch below).
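The GDP claim is just compound growth carried forward, as this short sketch shows. The starting levels and growth rates here are assumptions chosen for illustration, not the inputs the original projection used:

```python
import math

# Assumed inputs (illustrative): approximate 2023 nominal GDP in trillions
# of USD, and assumed average nominal growth rates in USD terms.
us_gdp, cn_gdp = 27.0, 17.8
us_growth, cn_growth = 0.02, 0.055

# Solve cn_gdp * (1 + cn_growth)**t == us_gdp * (1 + us_growth)**t for t.
t = math.log(us_gdp / cn_gdp) / math.log((1 + cn_growth) / (1 + us_growth))
print(f"crossover in {t:.1f} years, around {2023 + t:.0f}")  # ~12.4 years, ~2035
```

Under these assumptions the crossover lands near 2035; shrinking the assumed growth gap by even a point pushes it out by several years, which is why the growth asymmetry dominates the forecast.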
Practices
I value complete information, especially in code, and prefer full implementations over snippets. I train my model on clean datasets such as Simple English Wikipedia, with strict constraints against hallucination. However, the anti-hallucination practice is not working yet, as 27M parameters is not enough. I will continue to work on the project until it reaches my standards.
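As an illustration of what "clean dataset" means here, the sketch below applies the kind of heuristic filters commonly used on sources like Simple English Wikipedia. The rules and thresholds are placeholders, not the actual AaI pipeline:

```python
import re

def clean_article(text: str) -> str | None:
    """Return a cleaned article, or None if it should be dropped."""
    text = re.sub(r"\[\d+\]", "", text)       # strip citation markers like [3]
    text = re.sub(r"\s+", " ", text).strip()  # collapse runs of whitespace
    if len(text.split()) < 50:                # drop stubs that are too short
        return None
    letters = sum(c.isalpha() for c in text)
    if letters / max(len(text), 1) < 0.6:     # drop markup- or table-heavy pages
        return None
    return text

raw = ["A stub.", "April is the fourth month of the year. " * 30]
dataset = [t for t in map(clean_article, raw) if t is not None]
print(len(dataset), "of", len(raw), "articles kept")  # 1 of 2
```

Filtering like this narrows what the model can claim to what the corpus actually supports, which is the spirit of the anti-hallucination constraint, even if 27M parameters is too small to enforce it reliably.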
I also edit and structure Wikipedia-style pages for clarity, formatting accuracy, and durable knowledge representation, using AI tools to check formatting and fix errors. I have edited the Scratch Wiki for over a year, so I have experience writing wiki articles.
To do:
For the article Knowledge cutoff
Main concern
[edit]"I'm afraid I'm going to have to quick-fail this. The first thing that jumped out at me was the use of Fox News as the most cited source in this article. See WP:FOXNEWSSCIENCE. For an article about a technical subject like this at the GA level, I would expect to see mostly sources that specialize in tech, and if any general audience media were used, at least only the highest quality such as the TIME source that was included.In addition to that, there's an entire paragraph that's lifted almost verbatim from technologyreview.com. And my one foray into fact checking was to look at This is caused by the fact that almost all large language models are trained on static datasets, and training on newer data would cause a major price concern, given that training the most powerful large language models may soon cost over a billion dollars according to Time.[3]
for which I found that the source says nothing about static datasets."
I hate to sound harsh, but this is the third quick fail in a row. I strongly suggest you do not bring this back to WP:GAN. A forum like WP:PR might be a better place to get feedback from other editors. RoySmith (talk)19:01, 30 August 2025 (UTC)"
This is a constructive review.
- *hopefully gets accepted* To do (wishful thinking)
- Remove or reduce the Fox News sources and replace them with better ones. To do
- Remove the copyvio. Done
- Fix the unsupported claim cited to Time. Done
- Upgrade the remaining sources to the quality of the Time source. To do
- Open a peer review (WP:PR) instead of renominating at WP:GAN. To do