Note;
1B = byte = 1 byte = 8 bits
1kb = kilobit = 1000 bits
1Kb = Kilobit = 1024 bits
1KB = KiloByte = 1024 bytes = 1024 * 8 bits = 8.192 bits
I apologize for the size of this post. I don't post too often, yet when I do, it gets messy.
End_of_note;
A purge consisting only of file deletion is out of the question - we all know this.
No one wants his file deleted, just like that, for subjective reasons.
Tight rules must be implemented; otherwise, sooner or later, this issue will occur again.
A more complex yet community-dependent idea would be to issue 2 ratings to a map: 1 for admins and 1 for users. This way, all moderators of "Maps" could rate the map. In the end, the server computes an average - that's the Admin Rating.
Users should also rate a file after they have downloaded it. I would suggest enforcing a download restriction for files in the same section: if one downloaded a map, he/she shouldn't be able to download another until he/she rates that map.
It would be rather frustrating, though effective (anyone would soon be forced to start rating the files they download if they want to download more). So, perhaps you can change the site code to implement a new section in the User Control Panel/User Console called "Ratings", with entries for all downloaded files. If a user does not rate, let's say, 50 files per site section, he should be denied further downloads from that section until he rates at least 50% of his downloads - or 100%. When he/she reaches the 50-file threshold again, the restriction reactivates, and so on, until users learn to always rate files.
With user ratings, the site could then compute a new average - the User Rating.
Based on these 2 ratings the site would finally compute a single rating - the File Rating - let's say about 75% Admin Rating + 25% User Rating - or 50% each, you decide.
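To make the scheme concrete, here is a minimal Python sketch of the combined rating. The function name and the sample numbers are my own; the 75/25 split is just one of the weightings suggested above.

```python
def file_rating(admin_ratings, user_ratings, admin_weight=0.75):
    """Combine moderator ratings and user ratings into a single File Rating."""
    admin_avg = sum(admin_ratings) / len(admin_ratings)
    user_avg = sum(user_ratings) / len(user_ratings)
    return admin_weight * admin_avg + (1 - admin_weight) * user_avg

# e.g. admins average 4.0, users average 2.0 -> 0.75*4 + 0.25*2 = 3.5
print(file_rating([4, 4], [2, 2]))  # 3.5
```

With `admin_weight=0.5` you get the 50/50 variant instead.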
Using a certain minimal rating value, after 1, 2 or 6 months the site would compute a threshold. Take the file with the most downloads ever in each section (considering it the best/most popular file in its section). Take the file currently being checked in the same section. Compute an average downloads-per-day value for each file. Compute what percentage the currently checked file's download average represents of the most popular file's download average.
With this value and the File Rating, the site could decide either to issue a warning to the author of that file, or to delete it. Or, to highlight it to a moderator. Whatever the Admins decide to program the site to do.
The age of a file shouldn't be a variable on its own. Computing that downloads/day average uses the age of the file, but not directly - it rather measures how effective that file is in download terms. W3 Viewer was created, let's see, back in 2003? 2004? So it's more than two years old. Should it be deleted just based on that? No.
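The downloads-per-day comparison described above could be sketched like this (the function names and the sample numbers are hypothetical):

```python
def downloads_per_day(total_downloads, age_days):
    """Average download rate of a file over its whole lifetime."""
    return total_downloads / age_days

def popularity_percent(file_dl, file_age, top_dl, top_age):
    """What percentage of the section's most popular file's download
    rate the currently checked file achieves."""
    return 100.0 * downloads_per_day(file_dl, file_age) \
                 / downloads_per_day(top_dl, top_age)

# An old file with steady downloads still keeps a healthy percentage:
# 730 downloads over 730 days vs. 3650 downloads over 365 days.
print(popularity_percent(730, 730, 3650, 365))  # 10.0
```

Note how the file's age enters only through the rate, exactly as argued above: an old file is not penalized for being old, only for being rarely downloaded.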
Yeah, this would create a whole lot of text entries (worst case, every user has rated every file on WC3S, which would lead to TotalNrOfFilesOnWC3S * TotalNrOfUsers entries). I just wrote a 268.435.456-byte (256 MB) text file consisting of every ASCII character from #0 to #255, repeated. Compressed with Zip at Best (using WinRAR), it got down to 1.041.330 bytes (about 0,99 MB) - roughly 0,39% of the original.
Not that big anymore, huh?
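For anyone who wants to reproduce the experiment without writing a 256 MB file, here is a small-scale Python version using the standard zlib module (Deflate, the same algorithm Zip uses). The exact figures will differ from the WinRAR result, but the point holds: data made of a repeating pattern compresses to a tiny fraction of its size.

```python
import zlib

# 1 MiB built from the repeating byte pattern #0..#255, as in the
# experiment above - every 256-byte block is identical.
data = bytes(range(256)) * 4096
packed = zlib.compress(data, 9)          # maximum compression level
ratio = 100.0 * len(packed) / len(data)
print(len(data), len(packed), round(ratio, 4))
```

On repetitive input like this, Deflate ends up well under 1% of the original size.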
1) Maps.
I do not know what to say about the maps section. It is not my field of expertise in W3, so I leave this open for you.
The main idea would be to have a size limit corresponding to its dimensions (map dimensions as W3 uses them, not filesize). Then, maps should receive a rating based on a composite metric: complexity, playability, overall quality, detail level and so on - as I said, this is not my domain, so I do not know exactly which fields should be present here.
2) Skins & Icons.
Detail level is a REQUIRED field for these sections. Skins/icons should be obtained by merge or scratch work. Recolors are out, and so are copy-pastes.
For icons, a BMP at 32bpp is at most 64 (width) x 64 (height) x 4 (32bpp = 4 Bpp) = 16.384 bytes (16 KB) large. A full pack would be 4 times that, so 64 KB per icon, provided it comes with all four BLPs: BTN, DISBTN, PASBTN and DISPASBTN.
For skins, a BMP at 32bpp is at most 256 (width) x 256 (height) x 4 = 262.144 bytes (256 KB) large.
These were BMP uncompressed sizes.
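The size arithmetic above can be checked with a few lines of Python (the helper name is mine; this covers only the raw pixel data, ignoring BMP headers - which is why real BMP files come out slightly larger):

```python
def bmp_size(width, height, bpp):
    """Uncompressed pixel-data size in bytes; bpp = bits per pixel,
    so bpp // 8 gives bytes per pixel."""
    return width * height * (bpp // 8)

print(bmp_size(64, 64, 32))        # 16384  -> a 16 KB icon
print(bmp_size(256, 256, 32))      # 262144 -> a 256 KB skin
print(4 * bmp_size(64, 64, 32))    # 65536  -> the full 64 KB icon pack
```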
Now, BLP compression kicks in - reducing as much as it can. Zip comes next.
I'd make a suggestion here. Wouldn't it be better if WC3S only hosted the original image? For example, just the BTN image in BMP 16/24-bit format, or the BLP, compressed as ZIP. Below the download link there should be a link to a BMP-to-BLP converter/Warcraft 3 Viewer or any other program capable of converting a BMP to a BLP with a user-friendly interface (we have all wondered at some point what the Hell we should do with that ZIP file), with minimal input requirements, composing the required BLP files from the source image alone (from only the BTN BMP file to all 4 BLP files - BTN, DISBTN, PASBTN and DISPASBTN - also adding the Alpha mask).
This would reduce the pack size to a theoretical 25% (1 image out of 4).
From my MedalionOfWisdom (BTN image, 64x64 24bpp BMP file):
BMP 32bpp: 16.440 bytes -> 12.150 bytes of ZIP archive
BMP 24bpp: 12.344 bytes -> 10.883 bytes of ZIP archive
BMP 16bpp: 8.248 bytes -> 6.057 bytes of ZIP archive
BLP : 10.498 bytes -> 8.093 bytes of ZIP archive
The BLP file was composed from the 24bpp BMP and had an Alpha Mask added by Warcraft 3 Viewer.
After all, the DIS, PAS and DISPAS variants are not distinct images - they are derived from the BTN. So, those files are just ballast.
Implementing a software-independent program that could offer centralized image conversion and processing (allowing the user to obtain any of the BTN, DISBTN, PASBTN or DISPASBTN BLP files only from the BTN source image - either TGA, BMP or JPG) would lift the burden of space waste from the server.
3) Models & Spells.
I am no modeller, so I am not the one to suggest rules here, but a general Quality versus Size rule is necessary.
You can't delete a file based only on its age or size. These factors are generic. A rule based on quality is definitely the best option. Poly count is critical.
I am not familiar with what W3 does with skins, though... Are they temporarily extracted to a temp folder? If so, a large number of textures per model would lead to free-space fragmentation. Ugly. Users should in any case be encouraged to use as few texture files per model as possible.
No other ideas here...
4) AI & JASS.
The AI & JASS files are, if I recall, text files. So, ZIP should be enough for them. Or RAR with Solid Archive set to true.
5) Art.
This is fan-related. I don't know if any rules can really be phrased here... It's a question of personal taste, don't you think?
I'd suggest encouraging users to host their files on sites like ImageShack.us. After all, these are personal files, so IS.us shouldn't have anything to say about it. A thumbnail and a URL would be enough on WC3S.
6) Tools.
Hm...
This is the category where I tried to upload something but was refused, so I may not be 100% objective.
Tell you what? I'll try to be 75% objective. I guess that's the best I can do.
As tools require programming skills, I don't think they are a serious problem for WC3S. First of all, because there will never be thousands of them, as there are Icons/Models. On the other hand, they can be very large.
Some rules I'd suggest here: no platform-dependent tools should be allowed. If you do not accept icons for models that do not exist yet, why should you allow such tools? A while ago I downloaded the W3 Damage Calculator (Yeeeah! only around 500 KB! a joy for my crappy dialup connection, which gives me some 1,5-2 KB per second, 3 KB/sec on a good day!) and, when it was all over and I tried to run it, I was informed that I needed some .NET framework installed to run the damn thing!
Please note: I express no opinion regarding W3DC's quality - I was unable to run it. I refer only to the ability to run a program.
So, please do not accept tools that can't run independently, as a standalone EXE file. Even if they do need some DLLs or other files, those must be WITHIN the download archive, not on a third-party website.
On the other hand, there shouldn't be 50 tools for the same function. If possible, only accept, or at least favor, the tools implementing the most functions!
Tools should be allowed to be complex as long as they are compact. Ergonomic interfaces should be encouraged through pertinent feedback such as comments, while the number of files per program should be kept to a minimum.
Tools should ship with complete documentation alongside the program files. Yes, it's good to help/be helped by forum members, but always seeing the same thread title/question in comments - "Why can't I run the program?" - gets tiresome and increases the number of text entries in the forums, making them more of a jungle. Basic usage tutorials within the download ZIP archives should at least be encouraged, if not required.
7) Forums.
Implement an archive module for posts older than 1 year, for posts that have not been viewed in a long time, or for posts that have not been replied to in a long time.
After 1,5-2 years, either pack them as ZIP/RAR/CAB/ACE/UHA or delete them. Although, keeping a record would be interesting.
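A minimal sketch of such an archival filter, assuming each thread is just a (title, last-activity date) pair; the representation, cutoff and sample data are all hypothetical:

```python
from datetime import datetime, timedelta

# Threads inactive for longer than this are candidates for archiving.
ARCHIVE_AFTER = timedelta(days=365)

def threads_to_archive(threads, now):
    """Return the titles of threads whose last activity is older
    than the cutoff."""
    return [title for title, last in threads if now - last > ARCHIVE_AFTER]

now = datetime(2006, 1, 1)
threads = [("old idea", datetime(2004, 6, 1)),
           ("fresh topic", datetime(2005, 12, 1))]
print(threads_to_archive(threads, now))  # ['old idea']
```

The same filter with a larger cutoff (1,5-2 years) would select the threads to pack or delete.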
Donations should, and perhaps ought to, be made more often - provided you're in a country with PayPal support...
If the server is Win-based, perhaps defragmenting it from time to time would lead to improved speeds and lower free space fragmentation?
Overall, composite metrics would be the best option for defining the "quality" aspect of any uploadable resource.
Well, these are my suggestions. Feel free to comment on them (using this format please: "5) Art:", so that I/others can better identify what was commented on).
Hope I didn't offend anyone out there.
EoF.BlackDoom;