Submitted 4 years ago. That was the first image I submitted after I got a real account; I'd have to search for older ones that weren't associated with my account. (I uploaded them with a screen-cap tool.)
Think of Imgur as the YouTube of image serving, I guess.
"Imgur keeps images forever as long as they are receiving at least 1 view every six months. If they are not, they may be deleted to save space. Popular images and those shared on Imgur will probably meet this criterion by going viral and being more accessible.
UPDATE: As of February 2015, Imgur now states that images are never deleted unless requested."
We have the $20/month plan (for free because we have the "powered by Digital Ocean" thingy).
The next plan is $40/month and we'd get only 60 GB… and of course we'd be outside the limits of the free offer with that one…
Yeah, really… everyone can chill now; it was just a suggestion and an attempt to help! I figured someone is already paying for it and doing maintenance, and if that's the issue, some of us could volunteer to help out every so often. But, okay, whatever.
I think that Quora link was where I'd seen the six-month reference, so it wasn't exactly "made up" and pretend! I somehow managed to overlook the update that was added.
I figured you were on the $20 plan and understood the SSD Block Storage to be offered in addition to the 40 GB in your current plan. Actually, logging into my old account… unless the configuration below is invalid (or they can't increase the size, which would be sad), this would seem to work. Y'all's choice, just a suggestion.
(Look, I even used Imgur below to make (almost) everyone happy!)
Okay, so now I admit they HAVE made the process of uploading stuff easier than I remember, and the file drop actually worked in Linux Mint… But it does suck if the site is blocked for someone.
But I can see that the stupid Imgur post cut off the part I was trying to show, where it has pricing for block storage… lol. Click through to see the price if interested.
But that's just a bet… I think many people also bet high on a thing once called something like go0cogl3de (my memory of it is becoming garbled)…
Also, that's quite a different matter: how many people all over the world have updated backups of many GitHub repos, including JME? They don't even need to be that up to date… in the sense that, if we have at least the sha1sum of the latest version of everything, any of us could help restore a fully functional copy in case Sandy (or his zealot) comes by.
I think even Kademlia could be an option. Do we all care? If we all shared the responsibility for keeping it alive, that would be cool.
So… on the other hand, how many people have a full backup of the quite good information we find here? I guess none? Even some low-viewed threads contain gems: a dropped piece of code with some good tips that saves us from spending too much time researching…
Anyway, I think the keyword here could be automation. Why not automate maintenance tasks that optimize disk usage?
. convert images to WebP,
. convert videos to HEVC/WebM,
. audio too?
. fully compress old threads, decompressing them on demand,
. restrict the number of remote backups to 1, and keep local EXT4 historical backups on 2 cheap, slow 1 TB external USB HDDs.
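As a rough illustration of the "fully compress old threads" item, here is a minimal bash sketch. It assumes a hypothetical layout where thread pages have already been exported as HTML files under a `./threads` directory, and it creates a demo file so it runs on its own; a real run would select files by age instead (e.g. `find -mtime +180`).

```shell
#!/bin/sh
# Hypothetical layout: exported thread pages live under ./threads.
mkdir -p threads
printf '<html>demo thread</html>' > threads/12345-demo.html   # demo file, stands in for a real export
# A real run would filter by age, e.g.: find threads -name '*.html' -mtime +180 ...
# gzip keeps the original name and adds .gz; -9 trades CPU for size, -f overwrites.
find threads -name '*.html' -exec gzip -9f {} \;
```

Decompressing on demand would then just be `gzip -dk` (or `zcat`) on the requested file.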
That can all be done easily on Linux with bash scripts and command-line tools (even querying DBs and XML files, etc.), but I guess the server is not that accessible, may not even be Linux, and may not have a VM with access to these files (Cygwin may be too slow, though)…
Who would do that automated maintenance coding work? Well, if it could be put on GitHub, any of us could contribute.
I suspect Imgur will be around until Facebook, Google, Amazon, or Microsoft decides to buy it, like every other dang thing being bought up. In fairness, I guess they'd probably keep the old domain/links working for some time even if they rebranded it.
Converting to a more compressed format sounds helpful, but I'd be pretty nervous doing that to all the old content (without breaking any links). And all of that is a lot of (unfun) work.
The forum doesn't convert stuff to JPG, does it?
Yeah, I saw your other comment in the WIP thread. It wouldn't be very fun telling all the new visitors how to get around that. But I do understand the points being made about the pain of O&M for things here. No perfect solution.
If the server is Linux, the filesystem is most probably ext, which supports filesystem-level symlinks that look like real files to applications, e.g.:
SomeImg.jpg (the real image file)
Convert it to WebP, keeping the whole original filename inside the new one: SomeImg.jpg.webp (a real, much smaller image file). This guarantees no conflict with a possibly existing file named SomeImg.webp.
Delete SomeImg.jpg.
Symlink (this is the magic): SomeImg.jpg -> SomeImg.jpg.webp (a symlink pointing to the new file).
The symlink now has the same name as the old file, which all applications will use transparently, without problems; nothing will break unless an application is written to reject symlinks, which would be quite unusual.
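The trick could be sketched in bash like this. It is a self-contained demo, not the real pipeline: a plain `cp` stands in for the actual conversion (which would be something like `cwebp SomeImg.jpg -o SomeImg.jpg.webp`), and the image content is a placeholder string.

```shell
#!/bin/sh
printf 'image bytes' > SomeImg.jpg        # demo stand-in for the real image
cp SomeImg.jpg SomeImg.jpg.webp           # stand-in for the real cwebp conversion step
rm SomeImg.jpg                            # delete the original
ln -s SomeImg.jpg.webp SomeImg.jpg        # the old path now resolves to the new file
cat SomeImg.jpg                           # any app opening the old name still works
```

The key point is the last two lines: after `rm` plus `ln -s`, every existing URL or path that referenced `SomeImg.jpg` keeps working, served from the smaller file.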
An endless stream of mirrored links will never break!
As soon as we have something like Kademlia running, even if only as read-only content,
repos, forums, sites, and wikis will never be offline again, and in some countries the speed boost may be amazing (since even a person's brother's computer can serve its mirrored copy through the LAN).
Well, it may not be that up to date, though.
And that would not even burden the server's upload bandwidth; it could even lower it, since every person's mirror would be forwarded to other people all over the world as soon as some part is missing. Yeah, it is like BitTorrent.
The idea of having everything backed up multiple times and uniformly distributed across community members' computers looks promising, theoretically speaking (naturally, a few extra GBs on a local drive dedicated to our beloved JME costs most of us nothing most of the time, I believe). However, to guarantee sustainability, we have to make sure the community is big enough to have enough peers online to provide the necessary bandwidth at any given time of day. If stats analysis doesn't prove otherwise, I'd give it a try, at least maybe in some test mode to see real data, if @teique can supply us with a clear set of instructions for what to set up and how, of course.
We are talking about overly complex things here IMO.
I guess the monthly gallery is the culprit, since there aren't that many images in other threads. If depending on external hosting is a problem, a partial solution might be to create a new section for the monthly gallery and disable uploads only there (not sure if that's possible with Discourse, though). Even if some of those images break in 3 years, it won't be a big deal anyway.
Export each thread in an equivalent of "save as web page, complete".
Compress the files of each forum thread, using the thread's integer id and subject as the file name.
Enable Kademlia or torrent for these compressed files.
Anyone needing to read some forum thread could look it up on BitTorrent.
After the download, the files would be extracted and browsed offline.
As an alternative to Google, a search over the file contents could help find the wanted topics.
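The export and compress steps could look something like this in bash. Everything here is a placeholder: a real export would fetch the page with something like `wget --page-requisites --convert-links <thread URL>`, while this sketch writes a stub file so it runs offline, and the id/subject values are invented.

```shell
#!/bin/sh
# Placeholders for a real thread id and subject.
ID=1234
SUBJECT="some-thread-subject"
mkdir -p "thread-$ID"
# Stub for the saved page; a real run would produce this via wget.
printf '<html>saved thread</html>' > "thread-$ID/index.html"
# Archive named with the thread's integer id and subject.
tar czf "$ID-$SUBJECT.tar.gz" "thread-$ID"
# The .tar.gz could then be seeded over torrent/Kademlia by a separate tool.
```

Seeding and lookup would be handled by whatever torrent/Kademlia client the community settles on; the naming convention is what makes a thread findable by id or subject.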
Oh, btw, exceptions aside, I only throw out ideas; someone else with the time, energy, and knowledge would be required to put all this together, but I can still provide bits of bash code.
Oh, you're right, actually we could just get more space for a fair price… I didn't see that…
$48 a year for double the space… could be the best solution…