[Share] ooeedd December 2021 source data, for everyone to remix

Hello, I have a question about mp3 downloading. I have extracted all the mp3 links from the html files and put all 467484 URLs in a text file, and now I need to download them all with wget. I know wget can resume a partially downloaded file with -c, but with an input file this large I can't download all of the mp3s in one day, so I'll have to press Ctrl+C to pause wget and resume the next day. Will -c be able to resume downloading the rest of the mp3s?

I don't have much experience with wget; how about a GUI downloader?

After many tries, I've had enough of the browser's built-in downloader. I searched for downloaders in the software center and chose uGet, and it works perfectly without complaining about network errors.

I've tried JDownloader and Free Download Manager, but they don't support reading links from a text file. I've tried wget, and it does; the speed is good, and it also keeps the original file names, which is very important. I'm just not sure whether it can resume downloading the rest of the files. I've set up the Windows Subsystem for Linux just to get wget. I have two options: download from the full link list with pauses and resumes, or split the link text file and download in smaller batches.
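The batch approach can be sketched with the standard `split` tool. A minimal sketch; the file names (`all_links.txt`, the `batch_` prefix) and the 25 sample URLs are made up for illustration:

```shell
# Build a small stand-in for the real URL list (25 fake mp3 URLs).
seq 1 25 | sed 's|^|https://example.com/sound|; s|$|.mp3|' > all_links.txt

# Split into numbered batches: -l sets lines per batch,
# -d gives numeric suffixes (batch_00, batch_01, batch_02).
split -l 10 -d all_links.txt batch_

wc -l batch_*   # 25 lines -> two batches of 10 and one of 5
```

Each batch can then be fed to wget in turn, e.g. `wget -c -i batch_00`, and a paused batch can be resumed later with the same command.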

omg, if you use Windows, give a cracked copy of IDM a try. It's far more capable; it works like magic on my other Windows 7 machine.

Does it support loading links from a text file? I've just completed the first batch of mp3 links: of 10k links, 20 are missing, which is quite annoying. I'll try IDM next.

Hello,

I'm not really familiar with advanced wget commands, but as you suggested in [Share] ooeedd December 2021 source data, for everyone to remix - #46, from RandomTelegram, you could split the link text file and run wget multiple times to download the files. To make sure all files were downloaded, you could run wget against the text file again after the download has finished, with the -N flag; this way only files that are newer on the server are downloaded and overwritten, and any links that were missed get fetched.
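A purely local completeness check is another option: compare the URL list against the files already on disk and re-feed only the missing ones to wget. A minimal sketch, assuming wget saved each file under its URL basename; the URLs and file names below are illustrative:

```shell
# Stand-in URL list (three fake mp3 URLs).
printf '%s\n' \
  'https://example.com/a.mp3' \
  'https://example.com/b.mp3' \
  'https://example.com/c.mp3' > urls.txt

touch a.mp3 c.mp3            # pretend these two already finished downloading

# Emit every URL whose target file does not exist locally.
while read -r url; do
  [ -e "${url##*/}" ] || echo "$url"
done < urls.txt > missing.txt

cat missing.txt              # only b.mp3's URL remains
```

Running `wget -c -i missing.txt` then fetches only what's left; the -N approach asks the server instead, at the cost of one request per URL.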

You do not need to set up WSL in order to run wget; you can get a Windows executable from here: GNU Wget 1.21.3 for Windows

Thank you very much. I'm using wget under WSL to download the mp3 files in smaller batches, and I'm almost halfway through the queue. However, some links are dead, and I don't know how to replace them.

Could you give me a list of the links that are dead? Perhaps I can help.

Here's the first batch as an example; there are still more. When I try to open these links, I get hit with error 500, so the files basically don't exist anymore. To replace these dead mp3s, I'd need to identify what each one is about, locate it on the website, fetch the new mp3, and swap it in for the dead one. It's a troublesome job, code-wise.
dead_links.txt (3.4 KB)
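One way to collect the dead links automatically is to scrape wget's own log instead of re-opening each URL by hand. A sketch, assuming the batch was run with logging enabled (`wget -o wget.log -i batch.txt`); the log excerpt below is fabricated to show the general format, which can vary between wget versions:

```shell
# Fabricated wget log excerpt: each request starts with a
# '--timestamp--  URL' line, and a failure ends with an 'ERROR NNN' line.
cat > wget.log <<'EOF'
--2021-12-01 10:00:00--  https://example.com/alive.mp3
HTTP request sent, awaiting response... 200 OK
--2021-12-01 10:00:05--  https://example.com/gone.mp3
HTTP request sent, awaiting response... 500 Internal Server Error
2021-12-01 10:00:05 ERROR 500: Internal Server Error.
EOF

# Remember the last URL seen; print it whenever an ERROR 500 follows.
awk '/^--/ { url = $NF } /ERROR 500/ { print url }' wget.log > dead_links.txt

cat dead_links.txt   # only the gone.mp3 URL
```

The resulting dead_links.txt can be shared directly or fed back into wget once replacements are found.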

Here are the sounds:

Dead Links - Sounds.rar (259.7 KB)

Please compile the broken sound links into a text file and I'll upload them for you.

Thank you very much :smiling_face_with_three_hearts:. How were you able to download them? When I checked them on the website, they weren't there. I'll compile the full list of dead links.

You're welcome. The reason I have the files is that I downloaded all of the sound files while the shared entries were still fresh, so I have them lying around.

Here’s the complete list.
dead_links.txt (1.3 MB)

Here you go:

Thank you very much for the files.

Can I use them too?

Feel free to use the sounds; the share belongs to everybody.