Why was it down?

I think Santa Claus raped my dog.

Oh, I thought we were making random stuff up and pretending it was true. :slight_smile:

Here is one of my early images:
http://i.imgur.com/KG0FaSF.png

Submitted 4 years ago. That was the first image I submitted after I got a real account; I'd have to search for older ones that weren't associated with my account. (I uploaded them with a screen-cap tool.)

Think of Imgur as the YouTube of image serving, I guess.

2 Likes

https://www.quora.com/Imgur-How-long-are-the-images-stored-before-being-purged

"Imgur keep images for ever as long as they are receiving at least 1 view every six months. If they are not they may be deleted to save space. Popular images and those shared on Imgur will probably meet this criteria due to becoming viral and being better accessible.

UPDATE: As of February 2015, Imgur now states that images are never deleted unless requested. "

2 Likes

Cool. So you will also set it up and manage it forever. And collect the money and pay the bills.

Excellent. Just let us know when we can point the domain to you and we can wash our hands of this whole constant-maintenance thing you're asking of us.

Also this is not really accurate.

We have the $20/month plan (for free, because we have the "powered by DigitalOcean" thing).
The next plan is $40/month and we'd only get 60 GB… and of course we'd be outside the free offer with that one…

2 Likes

End the chain of keyboard violence!

2 Likes

Middle East …

Yeah, really… everyone can chill now, it was just a suggestion and an attempt to help! I figured someone was already paying for it and doing maintenance, and if that's the issue, some of us could volunteer to help out every so often. But, okay, whatever.

I think that Quora link was where I'd seen the six-month reference, so it wasn't exactly "made up" and pretend! I somehow managed to overlook the update that was added.

@nehon I was referring to this: https://www.digitalocean.com/pricing/#storage

I figured you were on the $20 plan and understood the SSD Block Storage to be offered in addition to the 40 GB in your current plan. Actually, logging into my old account… unless the configuration below is invalid (or they can't increase the size, which would be sad), this would seem to work. Y'all's choice, just a suggestion.

(Look, I even used Imgur below to make (almost) everyone happy!) :stuck_out_tongue_winking_eye::sunglasses:

Okay, so now I admit they HAVE made the process of uploading stuff easier than I remember, and the file drop actually worked in Linux Mint… But it does suck if the site is blocked for someone.

1 Like

But I can see that the stupid Imgur post cut off the part I was trying to show, where it has pricing for block storage… :laughing: Click through to see the price if interested.

But that's just a bet… I think many people also bet high on a thing once called something like go0cogl3de (the memory of it is becoming garbled)…

Also, that's quite a different matter: how many people all over the world have updated backups of many GitHub repos, including JME? It doesn't even need to be that up to date… in the sense that, if we have at least the sha1sum of the latest of everything, any of us could help with restoring a fully functional copy in case Sandy (or his zealots) comes by :slight_smile:

I think even Kademlia could be an option. We all care, right? If we all shared the responsibility to keep it alive, that would be cool.

So… on the other hand, how many people have a full backup of the quite good information we find here? I guess none :confused:? Even some low-viewed threads contain gems: a bit of dropped code with some good tips that saves us from spending too much time researching…

Anyway, I think the keyword here could be automation. Why not automate maintenance tasks that optimize disk usage?

  • convert images to WebP,
  • convert videos to HEVC/WebM,
  • audio too?
  • fully compress old threads, decompressing on demand,
  • restrict the number of remote backups to 1, and keep local EXT4 historical backups on 2 cheap 1 TB slow external USB HDs.

That can all be done fairly easily on Linux with bash scripts and command-line tools (even querying DBs and XML files, etc.), but I guess the server is not that accessible, may not even be Linux, and may not have a VMware VM with access to these files (Cygwin may be too slow, though)…
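For instance, here is a minimal sketch of the image task, assuming a Linux box with cwebp (from the libwebp package) installed; /path/to/uploads is just a placeholder, not the real server path:

    #!/usr/bin/env bash
    # Sketch: convert JPG/PNG uploads to WebP, keeping the WebP copy
    # only when it is actually smaller than the original.
    set -euo pipefail

    UPLOADS=/path/to/uploads   # placeholder directory

    find "$UPLOADS" -type f \( -iname '*.jpg' -o -iname '*.png' \) |
    while read -r img; do
      out="${img}.webp"                   # SomeImg.jpg -> SomeImg.jpg.webp
      cwebp -quiet -q 80 "$img" -o "$out" # lossy conversion, quality 80
      if [ "$(stat -c%s "$out")" -ge "$(stat -c%s "$img")" ]; then
        rm -- "$out"                      # no savings, keep the original
      fi
    done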

Who would do that automated maintenance coding work? Well, if it could be put on GitHub, any of us could contribute.

1 Like

I suspect Imgur will be around until Facebook, Google, Amazon, or Microsoft decides to buy it, like every other dang thing being bought up. In fairness, I guess they'd probably keep the old domain/links working for some time even if they rebranded.

Converting to a more compressed format sounds helpful, but I'd be pretty nervous doing that to all the old content (without breaking any links). And all that stuff is a lot of (unfun) work.

The forum doesn't convert stuff to JPG, does it? :cry:

1 Like

Yeah, I saw your other comment in the WIP thread. It wouldn't be very fun telling all the new visitors how to get around that. But I do understand the points being made about the pain of O&M for things here. No perfect solution.

But all of the links would break. That's what we're talking about. Quit changing the subject.

Just let us know when we can point the domain to your server.

1 Like

If the server is Linux, the filesystem is most probably EXT (ext4), and so accepts filesystem-level symlinks, which look like real files to applications, e.g.:

SomeImg.jpg (real image file)

  1. Convert it to WebP, keeping its whole filename; this guarantees it will not conflict with a possibly existing file named "SomeImg.webp":
    SomeImg.jpg.webp (real, much smaller image file)

  2. Delete SomeImg.jpg.

  3. Create the symlink (this is the magic):
    SomeImg.jpg -> SomeImg.jpg.webp (symlink pointing to the new file)

The symlink has the same name as the old file, so all applications will use it transparently; nothing will break unless an application is written to reject symlinks, which would be quite unusual.
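In shell terms, the three steps above are roughly this (a sketch; cwebp from the libwebp package is assumed to be installed):

    # 1. convert, keeping the full original name so an existing
    #    SomeImg.webp cannot collide
    cwebp -q 80 SomeImg.jpg -o SomeImg.jpg.webp

    # 2. delete the original
    rm SomeImg.jpg

    # 3. the magic: the old name becomes a symlink to the new file
    ln -s SomeImg.jpg.webp SomeImg.jpg

One caveat worth testing first: the web server will probably still send Content-Type: image/jpeg for the old .jpg name, so this leans on browsers sniffing the actual WebP bytes (or on fixing the server's MIME mapping).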

An endless stream of mirrored links will never break!

As soon as we have something like Kademlia running, even if only as read-only content, repos, forums, sites, and wikis will never be offline again, and in some countries the speed boost may be amazing (as even a person's brother's computer can be serving its mirrored copy through the LAN).

Well, it may not be that up to date, though.

And that wouldn't even burden the server's upload bandwidth; it could even lower it, as every person's mirror would be forwarded to other people all over the world as soon as some part is missing. Yeah, it's like BitTorrent.

The idea of having everything backed up multiple times and uniformly distributed across the community members' computers looks promising, theoretically speaking (naturally, a few extra GBs on a local drive dedicated to our beloved JME costs most of us nothing most of the time, I believe). However, to guarantee sustainability, we have to make sure that the community is big enough to have enough peers online to provide the necessary bandwidth at any given time of day. If stats analysis doesn't prove otherwise, I'd give it a try, at least maybe in some test mode to see real data. If @teique can supply us with a clear set of instructions for what to set up and how, of course :slight_smile:

1 Like

We are talking about overly complex things here, IMO.
I guess the monthly gallery is the culprit, since there aren't that many images in other threads. If depending on external hosting is a problem, a partial solution might be to create a new section for the monthly gallery and disable uploads only there (not sure if that's possible with Discourse, though). Even if some of those images break in 3 years, it won't be a big deal anyway.

3 Likes

I think this could be a start:

  • export each thread to an equivalent of "save as web page, complete"
  • compress the files for each forum thread, using the thread integer ID and the subject as the file name
  • enable Kademlia or torrent for these compressed files
    Anyone needing to read some forum thread could look for it on BitTorrent.
    After the download, the files would be extracted to be browsed offline.
    As an alternative to Google, a search over file contents could help in finding the wanted topics.

Oh, by the way, exceptions aside, I only throw out ideas; someone else with time, energy, and knowledge would be required to put all of this together. But I can still provide bits of bash code :slight_smile:
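For example, here is a rough bash bit for the first two steps. The thread ID, subject, and URL are made up for illustration; wget's -p/-k flags are the usual command-line equivalent of "save as web page, complete":

    #!/usr/bin/env bash
    # Sketch: mirror one thread for offline reading, then pack it
    # as <id>-<subject>.tar.gz for seeding.
    set -euo pipefail

    id=12345                        # hypothetical thread integer id
    subject="some-thread-subject"   # hypothetical slugified subject
    url="https://hub.jmonkeyengine.org/t/${subject}/${id}"

    # -p: fetch page requisites (images, CSS); -k: rewrite links for
    # offline browsing; -E: add .html extensions; -P: target directory
    wget -p -k -E -P "thread-${id}" "$url"

    tar czf "${id}-${subject}.tar.gz" "thread-${id}"
    # the resulting .tar.gz is what would be seeded over Kademlia/BitTorrent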

Oh, you're right, actually we could just get more space for a fair price… I didn't see that…
$48 a year for double the space ($4/month for another 40 GB of block storage)… could be the best solution…

2 Likes