User:Tim1357

Retired
This user is no longer active on Wikipedia.


This user had an account on the Wikimedia Toolserver.
This user has rollback and pending changes reviewer rights on the English Wikipedia. (verify)
This user has autopatrolled rights on the English Wikipedia. (verify)
This user attends or attended
Cornell University.
This user runs a bot, DASHBot (contribs). It performs tasks that are extremely tedious to do manually.


This user is proud to be a native and resident of the U.S. State of Vermont.
This user reviews dataset edits for ClueBot NG to help automatically mass revert vandalism on Wikipedia.






What I do



To Do



  • Remove |needs-image= from Film articles that have sufficient images.
Done

Green tickY Rebuild CatTrack Database

Green tickY Whittle away at Category:Film articles needing an image

Green tickY Rebuild script to use http://movieposterdb.com

Green tickY Build/Test/Implement a bot to move reviews from {{Infobox Album}} to {{Album reviews}}


Green tickY Re-write wikiproject_watchlist.py to use the recentchanges table.


Toolserver


Production


The WikiProject Watchlist is a tool designed to help WikiProjects keep track of their articles. It is made to look and feel like the watchlist that Wikipedia users already have. It accepts either a WikiProject banner template or a category.
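Roughly, the query behind it joins the recentchanges table against the project's banner category (the banner sits on talk pages, so the join goes through the talk namespace). A minimal sketch of that idea follows; the replica host, the credentials file, and the 'WikiProject_Film_articles' category are placeholders, and the real script handles paging, caching, and output formatting that this leaves out.

# Illustrative sketch only: list recent changes to articles whose talk pages
# carry a WikiProject banner category. Host, credentials file, and category
# name are placeholders, and the column names follow the old Toolserver-era
# schema.
import pymysql

conn = pymysql.connect(
    host="enwiki.example.replica",      # placeholder replica host
    db="enwiki_p",
    read_default_file="~/.my.cnf",      # assumes replica credentials live here
)
with conn.cursor() as cur:
    # Articles (namespace 0) whose talk pages (namespace 1) are in the
    # banner category, joined to recentchanges for a watchlist-style feed.
    cur.execute("""
        SELECT rc_timestamp, rc_title, rc_user_text, rc_comment
        FROM recentchanges
        JOIN page          ON page_title = rc_title AND page_namespace = 1
        JOIN categorylinks ON cl_from = page_id
        WHERE rc_namespace = 0
          AND cl_to = 'WikiProject_Film_articles'
        ORDER BY rc_timestamp DESC
        LIMIT 50
    """)
    for ts, title, user, comment in cur.fetchall():
        print(ts, title, user, comment)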


Hot Articles lists the most-edited articles within the scope of a given WikiProject over a certain period of time. It has an API that User:HotArticlesBot uses to generate its lists.
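The aggregation behind Hot Articles is essentially the same join as the watchlist sketch above, plus a GROUP BY and a time window. A rough sketch, with the seven-day window, host, and category chosen purely for illustration:

# Count edits per article over the last seven days, restricted to one
# WikiProject's banner category. Everything configurable here is an example.
from datetime import datetime, timedelta
import pymysql

since = (datetime.utcnow() - timedelta(days=7)).strftime("%Y%m%d%H%M%S")
conn = pymysql.connect(host="enwiki.example.replica", db="enwiki_p",
                       read_default_file="~/.my.cnf")
with conn.cursor() as cur:
    cur.execute("""
        SELECT rc_title, COUNT(*) AS edits
        FROM recentchanges
        JOIN page          ON page_title = rc_title AND page_namespace = 1
        JOIN categorylinks ON cl_from = page_id
        WHERE rc_namespace = 0
          AND cl_to = 'WikiProject_Film_articles'
          AND rc_timestamp > %s
        GROUP BY rc_title
        ORDER BY edits DESC
        LIMIT 10
    """, (since,))
    for title, edits in cur.fetchall():
        print(edits, title)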


Cat Track logs the number of pages in a category over time. It then produces a graph showing the population of that category, which is intended to be useful for monitoring the status of backlogs. I use the Google Visualization API, so it's nice and intuitive (although it does require Flash). See the page tracking the number of unreferenced BLPs as an example. The tool can also export the raw data (in CSV and WikiText formats).
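The logging half of Cat Track is simple: record a timestamped count per category, and let the graphing layer read those rows later. A sketch of just the logging step, with made-up category names and log file:

# Append one timestamped row per category to a CSV log; the chart and the
# WikiText export are generated from this file separately. The categories
# and the log path are placeholders.
import csv
from datetime import datetime
import pymysql

CATEGORIES = ["All_unreferenced_BLPs", "Film_articles_needing_an_image"]  # examples
LOG_PATH = "cattrack_log.csv"  # hypothetical log file

conn = pymysql.connect(host="enwiki.example.replica", db="enwiki_p",
                       read_default_file="~/.my.cnf")
now = datetime.utcnow().strftime("%Y-%m-%d %H:%M")
with conn.cursor() as cur, open(LOG_PATH, "a", newline="") as f:
    writer = csv.writer(f)
    for cat in CATEGORIES:
        cur.execute("SELECT COUNT(*) FROM categorylinks WHERE cl_to = %s", (cat,))
        (count,) = cur.fetchone()
        writer.writerow([now, cat, count])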


Development


This tool is in development. It is intended to create graphs for use with Sock Puppet Investigations. Currently it is limited to administrators on the English Wikipedia (users must log in with their TUSC account). It relies on Google's Visualization API, and it crashes when users have many edits, because the scatterplots are rendered client-side. I'm not sure yet how useful it will be.
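The data behind each graph is just a set of (date, time-of-day) points per user. Here is a hedged sketch of collecting those points through the public API; the real tool reads the replica database, restricts access via TUSC, and hands the points to Google's Visualization API for client-side rendering, none of which is shown here.

# Collect (date, hour-of-day) points for one user's edits via the API.
# "Example" is a placeholder username.
import requests

session = requests.Session()
points = []
params = {
    "action": "query", "list": "usercontribs", "ucuser": "Example",
    "ucprop": "timestamp", "uclimit": "500", "format": "json",
}
while True:
    data = session.get("https://en.wikipedia.org/w/api.php", params=params).json()
    for rev in data["query"]["usercontribs"]:
        ts = rev["timestamp"]                    # e.g. "2011-05-01T17:03:22Z"
        points.append((ts[:10], int(ts[11:13]) + int(ts[14:16]) / 60.0))
    if "continue" not in data:
        break
    params.update(data["continue"])

print(len(points), "points ready for a scatterplot")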


This tool ties together Google Maps and the Wikipedia API. It allows users to search within Google Maps, drag and drop pins, and then save the coordinates to Wikipedia in one click. It's still somewhat buggy and I'm not sure how useful it really is. Anyway, if you want to see something improved or implemented, leave me a message on my talk page.
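The "save to Wikipedia" step boils down to one authenticated edit through the API that drops a {{Coord}} template onto the article. A sketch of that step alone, using mwclient as a stand-in; the page title, coordinates, and account are placeholders, and the tool's own login flow differs.

# Append a {{Coord}} template to an article through the API. Everything
# below (account, page title, coordinates) is a placeholder.
import mwclient

site = mwclient.Site("en.wikipedia.org")
site.login("ExampleAccount", "example-password")   # placeholder credentials

def save_coordinates(title, lat, lon):
    page = site.pages[title]
    coord = "{{Coord|%.5f|%.5f|display=title}}" % (lat, lon)
    page.save(page.text() + "\n" + coord,
              summary="Adding coordinates picked from the map tool")

save_coordinates("Example article", 44.25906, -72.57539)  # made-up point in Vermont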

Example: Output for User:Example

This was an idea I played with a while back. I was experimenting with X!'s (awesome) edit counter tool and got an error saying my request had used too much memory. It turned out this happens because X!'s tool pulls all of an editor's edits into memory and then computes the edit-time graphs (the bars that show how many edits you've made to which namespaces, by month). I decided to see whether I could do this entirely on the database side, saving Toolserver resources. I did, and it's faster too! I haven't persuaded X! to use it yet, though.
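Doing it on the database side boils down to a single aggregate query: group the user's revisions by namespace and by month, and let MySQL return a few hundred summary rows instead of every edit. A sketch of that query, with column names as they were in the Toolserver-era schema and a placeholder username:

# Count a user's edits per (namespace, month) entirely in SQL, so only the
# aggregated rows leave the database. Host and username are examples, and
# rev_user_text reflects the old schema.
import pymysql

conn = pymysql.connect(host="enwiki.example.replica", db="enwiki_p",
                       read_default_file="~/.my.cnf")
with conn.cursor() as cur:
    cur.execute("""
        SELECT page_namespace,
               LEFT(rev_timestamp, 6) AS yearmonth,   -- 'YYYYMM'
               COUNT(*) AS edits
        FROM revision
        JOIN page ON page_id = rev_page
        WHERE rev_user_text = %s
        GROUP BY page_namespace, yearmonth
        ORDER BY yearmonth, page_namespace
    """, ("Example",))
    for ns, month, edits in cur.fetchall():
        print(month, ns, edits)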

Barnstars









Committed identity: 281f8d1695d0810c0bd305f556b56448 is an MD5 commitment to this user's real-life identity.
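For reference, a committed identity is just the hash of a secret string describing the real identity; revealing the string later and re-hashing it proves the commitment. A toy illustration (the secret below is invented and does not correspond to the hash above):

import hashlib

# Toy example only: this string is made up and does NOT match the digest
# published on this page. Verification means hashing the revealed string
# and comparing it to the published value.
secret = "Example Name, born 1 January 1990, example@example.org"
print(hashlib.md5(secret.encode("utf-8")).hexdigest())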