aMule Forum

English => Feature requests => Topic started by: nachbarnebenan on December 20, 2004, 07:03:55 PM

Title: crazy idea
Post by: nachbarnebenan on December 20, 2004, 07:03:55 PM
Ok, I don't know if this will work at all or how much it will affect performance, but I thought about implementing a file lock mechanism to reduce incompletable files. It could work like this:
1. A file is added to the download list (either via search or URI) and first gets "locked" status.
2. While locked, sources are searched with low priority, but no upload queue place is taken until all parts of the file have been seen (not necessarily all at the same time).
3. When all parts have been seen, the lock is opened and everything from that point on works like it does now, except that the already found sources are kept, of course.
The lock can also be opened manually, and the file could be shown in white in the download list while locked.

I guess this could decrease dead files by at least one third or more, and it would prevent wasting credits on incompletable files :)
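To make the idea a bit more concrete, here is a minimal sketch of how the "locked" state could be tracked; all class and member names are invented for this sketch and are not aMule's real ones. It simply accumulates which parts have ever been seen on any source and opens the lock once every part has been seen at least once:

#include <cstddef>
#include <vector>

class LockedDownload {
public:
    explicit LockedDownload(std::size_t partCount)
        : m_partSeen(partCount, false), m_locked(true) {}

    // A source has told us which parts it has; parts do not have to be
    // seen all at the same time, they accumulate over time.
    void OnSourcePartsSeen(const std::vector<bool>& sourceParts) {
        for (std::size_t i = 0; i < m_partSeen.size() && i < sourceParts.size(); ++i) {
            if (sourceParts[i]) {
                m_partSeen[i] = true;
            }
        }
        if (m_locked && AllPartsSeen()) {
            m_locked = false;  // every part has been seen at least once: open the lock
        }
    }

    void UnlockManually() { m_locked = false; }  // user override

    // While locked: search sources at low priority only, take no upload queue place.
    bool IsLocked() const { return m_locked; }

private:
    bool AllPartsSeen() const {
        for (bool seen : m_partSeen) {
            if (!seen) {
                return false;
            }
        }
        return true;
    }

    std::vector<bool> m_partSeen;  // which parts have ever been seen on any source
    bool m_locked;                 // the "locked" status from step 1
};

While IsLocked() returns true, the client would keep looking for sources at low priority and skip asking for upload queue slots, as in step 2 above.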
Title: Re: crazy idea
Post by: GonoszTopi on December 20, 2004, 08:06:53 PM
Maybe. After 2.0.0 final.
Title: Re: crazy idea
Post by: lfroen on December 21, 2004, 04:12:08 PM
nachbarnebenan: what exactly is the problem you are trying to solve? The amount of disk space taken by uncompleted files? You have a "delete file" command on your operating system for that.

Quote
i dunno if this will work at all and how much it will affect performance

It will not work. And it will not affect performance, whatever you mean by that.

Quote
while locked, sources are searched with low priority

It doesn't work that way. Sources are not searched according to some priority; file priority means priority while downloading/uploading.

Quote
but no upload queue place is taken

Which direction are we talking about?

Quote
the lock is opened and everything from this point on works like it does now

You mean that downloading doesn't start until ALL parts are available? Does it also mean that if some of the parts are no longer available (a source is dropped for whatever reason), you will abort this download? If your answer is "no", your internal logic is broken, 'cause this file is going to become "incompletable" by your definition.

Quote
i guess, this could decrease dead files be at least one third or more

How did you come up with the number "1/3"? A guess is not acceptable, sorry.
A detailed analysis of how the ed2k network works shows that it will have no effect.
Title: Re: crazy idea
Post by: skolnick on December 21, 2004, 05:24:39 PM
I think the problem he is trying to solve is downloading files with no complete sources, or at least only downloading files where all the incomplete sources, joined together, give the complete file. This has already been implemented in some eMule MODs; I think MorphXT already has it.

Regards.
Title: Re: crazy idea
Post by: nachbarnebenan on December 21, 2004, 07:38:49 PM
Yes, exactly. No file should start downloading until it is certain that all parts are available (or the user overrides it). Of course, a file can still get lost if the only available source goes offline, but this could give some improvement at least.
Title: Re: crazy idea
Post by: lfroen on December 21, 2004, 08:48:31 PM
WHY do you want to do it?

Quote
No file should start downloading until it's sure all parts are available

Why is it a problem if it DOES start downloading?
Title: Re: crazy idea
Post by: GonoszTopi on December 21, 2004, 09:30:00 PM
Quote
Why is it a problem if it DOES start downloading?
Because there are people around the world (although in a minority) who don't have unlimited disk space.
Title: Re: crazy idea
Post by: djtm on December 21, 2004, 10:18:44 PM
Yeah, it is a great idea.

It would also help in the search section:
The servers would only display many sources (which actually have hardly anything of the file) if the file is actually available...

But mostly, no space would be wasted on incomplete files.
And it should be very easy to implement.

The status should be saved: if the file was once found complete (or started), it should keep going...

while ( available_percent != 100 ){pause_download(this);}
or better
if ( available_percent == 100 ){start_download(this);}
or similar ;)
Title: Re: crazy idea
Post by: nachbarnebenan on December 21, 2004, 11:55:02 PM
> And it should be very easy to implement.

Yep, I guess it will take one of the core devs less than a minute to hack it. Nevertheless, it should not be added before 2.0, so as not to delay the final any further.
Title: Re: crazy idea
Post by: lionel77 on December 22, 2004, 03:46:48 AM
Quote
Originally posted by GonoszTopi
Quote
Why is it a problem if it DOES start downloading?
Because there are people around the world (although in a minority) who don't have unlimited disk space.
I think the even bigger issue here is to avoid wasting the upload bandwidth of the clients we would be downloading from. Since upload bandwidth is the most limited resource on the network, any attempt to reduce wasting it is very valuable...
Title: Re: crazy idea
Post by: lfroen on December 22, 2004, 12:52:43 PM
lionel77:
Quote
avoid wasting upload bandwidth

What does it have to do with upload?! By pausing a file in the download queue, how exactly do you save upload bandwidth?

GonoszTopi:
Quote
Because there are people around the world (although in minority) who don't have unlimited disk space.

When you put a file in the download queue, you obviously must have space for this file. The mere fact that this space is not allocated right away doesn't make it different. The space needed depends on the size of the whole file, not the size of the downloaded parts. Moreover, it's a nice feature of ext2/3 that makes it possible to allocate only the used parts; on vfat/ntfs it IS allocated right away.
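For what it's worth, here is a quick way to see that sparse-allocation behaviour for yourself on a POSIX system; it only illustrates the filesystem point above and has nothing aMule-specific in it:

// Create a 600 MB file with only one byte actually written, then compare
// its apparent size with the disk blocks really allocated.
#include <cstdio>
#include <fcntl.h>
#include <sys/stat.h>
#include <unistd.h>

int main() {
    const off_t kFileSize = 600L * 1024 * 1024;   // nominal part file size
    int fd = open("sparse_test.part", O_CREAT | O_RDWR | O_TRUNC, 0644);
    if (fd < 0) { perror("open"); return 1; }

    // Seek past the end and write one byte; on ext2/3 the hole is not allocated.
    lseek(fd, kFileSize - 1, SEEK_SET);
    write(fd, "", 1);

    struct stat st;
    fstat(fd, &st);
    printf("apparent size: %lld bytes\n", (long long)st.st_size);
    printf("allocated:     %lld bytes\n", (long long)st.st_blocks * 512);

    close(fd);
    unlink("sparse_test.part");
    return 0;
}

On an ext2/3 partition the allocated number stays tiny; on a filesystem without sparse-file support the whole size is taken right away.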

djtm:
Search doesn't check part availability. aMule has to actually connect to each one of the sources and ask which parts of each file are available. And there are limits on how fast that can be done, how many requests aMule can issue, etc.
Title: Re: crazy idea
Post by: phoenix on December 22, 2004, 06:02:29 PM
Folks,

Dead files are a problem, but how do you know that a file is really dead? I think that the only ones capable of knowing are the servers, which have a long uptime and could keep statistics about the last time a file part was seen on the net.

If the file is not really dead, there is no bandwidth waste, because everybody is getting the existing parts. When the missing parts come up again, the download continues.

Not starting a download because some parts are missing at the moment goes against the network philosophy, in my opinion. If you really want that file, you will leave your client there asking for it for a few days before dropping it from your queue. What harm does this file cause by being on the download queue?

1) Disk space allocation
2) Source exchange bandwidth

Number 2 should be *very* small; number 1 depends on your file system and free hard disk space.

Cheers!
Title: Re: crazy idea
Post by: lfroen on December 23, 2004, 10:06:40 AM
Quote
Not starting a download because some parts are missing at the moment goes against the network philosophy, in my opinion. If you really want that file, you will leave your client there asking for it for a few days before dropping it from your queue. What harm does this file cause by being on the download queue?

That is exactly what I was talking about.

And there's another way: write a Perl script that drives amulecmd, checks the available sources for each file, and calls "pause" for those that have an incomplete source set. And resume them later in the same way.
Title: Re: crazy idea
Post by: GonoszTopi on December 23, 2004, 09:01:18 PM
Quote
And there's another way: write a Perl script that drives amulecmd, checks the available sources for each file, and calls "pause" for those that have an incomplete source set. And resume them later in the same way.
Not exactly. amulecmd doesn't show part info, only available sources - so you cannot determine whether the full file is available or not.
Title: Re: crazy idea
Post by: lfroen on December 26, 2004, 06:49:15 AM
So maybe we should add it :)
Title: Re: crazy idea
Post by: Kry on December 26, 2004, 12:00:32 PM
...and please lfroen stop BOFHing
Title: Re: crazy idea
Post by: lfroen on December 26, 2004, 05:00:36 PM
Quote
stop BOFHing

BOFHing -?   ?(  ?(
Title: Re: crazy idea
Post by: Kry on December 26, 2004, 06:18:56 PM
trying to act like a BOFH :P

You know, we talked about that :P
Title: Re: crazy idea
Post by: phoenix on December 26, 2004, 07:15:14 PM
http://bofh.ntk.net/Bastard.html

:D
Title: Re: crazy idea
Post by: lionel77 on December 30, 2004, 05:17:21 PM
Quote
Originally posted by lfroen
lionel77:
Quote
avoid wasting upload bandwidth
What does it have to do with upload?! By pausing a file in the download queue, how exactly do you save upload bandwidth?
If you hadn't stopped reading my post right there, you would have understood: ;)
Quote
lionel77 wrote:
the even bigger issue here is to avoid wasting the upload bandwidth of the clients we would be downloading from.

The same issue is missing here:
Quote
Originally posted by phoenix
What harm does this file cause by being on the download queue?
1) Disk space allocation
2) Source exchange bandwidth
3) Tying up the upload bandwidth of the clients who are uploading chunks of the dead file to us

The whole idea is that downloading chunks of dead files wastes global upload bandwidth, assuming other clients are not exclusively sharing dead files. Every time you download 400 MB of a dead 600 MB file, you have wasted 400 MB of precious upload bandwidth (of other clients) that could have been used for spreading complete files.


I do see phoenix's point though, that we don't want to declare a file dead that is rare but still complete, and thereby reduce its spread even further.
You are probably right that the servers would be much better suited for such a monitoring task, so maybe somebody should bring that idea to lugdunum's attention...
Title: Re: crazy idea
Post by: GonoszTopi on December 31, 2004, 10:38:37 AM
My idea is: we already have a "Last Seen Complete" value on PartFiles. It would be very easy to add a new switch to Preferences->Files saying >>Don't start "Never Seen Complete" files<<. The implementation wouldn't be much harder, either.

(Just a suggestion, in case we decide to implement it sometime)
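A minimal sketch of what that check might look like, assuming a part file record with a lastSeenComplete timestamp and a boolean preference; these names are invented for illustration and are not the actual aMule Preferences/CPartFile API:

#include <ctime>

// Illustrative stand-ins for the real preference and part file objects.
struct Prefs {
    bool dontStartNeverSeenComplete = true;  // the proposed Preferences->Files switch
};

struct PartFileInfo {
    std::time_t lastSeenComplete = 0;  // 0 means "Never Seen Complete"
    bool pausedByUser = false;
};

// Decide whether a queued download may start transferring yet.
bool MayStartDownload(const PartFileInfo& file, const Prefs& prefs) {
    if (file.pausedByUser) {
        return false;
    }
    // New switch: hold back files that have never been seen complete on the network.
    if (prefs.dontStartNeverSeenComplete && file.lastSeenComplete == 0) {
        return false;
    }
    return true;
}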
Title: Re: crazy idea
Post by: Kry on December 31, 2004, 12:52:16 PM
I agree.
Title: Re: crazy idea
Post by: phoenix on January 03, 2005, 12:07:10 AM
lionel77,

I think you understood what I wrote, but you still assume that a file is dead when you say:

Quote
3) Tying up the upload bandwidth of the clients who are uploading chunks of the dead file to us

I think GonoszTopi's suggestion is a good compromise for "client-side dead file recognition".

Cheers!
Title: Re: crazy idea
Post by: lionel77 on January 09, 2005, 10:23:34 PM
@phoenix
Yep, that was meant under the assumption that the file really is dead.


Implementation via the existing variable 'last seen complete' should be pretty straightforward, but I'm still undecided about the risk/benefit ratio:
We've listed most of the advantages of such a feature above.
The possible risks that I see are:
- When releasers are using chunk-hiding mods, our download would not start before every chunk has been uploaded by the releaser at least once, so our download would be slowed down in these situations. No big deal for me, but other people are more impatient. In the extreme case of everyone using the described delayed-download feature, releases would be impossible with such a hide-chunk mod, but again this is too unrealistic to be a real problem.
- If a file is barely complete on the network and its chunks are distributed very sparsely, we could fail to detect that the file is actually complete and therefore never start to download it. This would actually be pretty bad, but I personally can't estimate the likelihood of this happening. Would this be affected by restarting aMule? (i.e. does aMule save info on the seen chunks of a file or not?)
Title: Re: crazy idea
Post by: superstoned on January 20, 2005, 03:19:59 PM
I'd say this is a great idea.

We should think of a way to ensure incomplete files don't spread. Maybe a newly added file would only start at low priority, and only increase when it has been seen fully available. And don't share such files. And probably warn the user after, say, 7x24 hours online that the file seems to be unavailable and further waiting is quite pointless (remove file now [ ] do this automatically from now on).

But there should be a way to ensure uncommon and new files DO get shared. Maybe at first we should share and download the file, but if after one hour the file does not seem to be complete, stop sharing and downloading...
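To make that concrete, a rough policy sketch under those assumptions (the grace period, the online-time threshold and all names are made up here, nothing from the real aMule code):

#include <ctime>

// Illustrative policy: a new file is shared and downloaded normally for a
// one-hour grace period; after that, if it has never been seen complete, it is
// kept at low priority and not shared, and after 7x24 hours online the user is
// warned that further waiting looks pointless.
enum class Action { ShareAndDownload, LowPriorityNoShare, WarnUser };

Action DecideAction(std::time_t secondsOnlineSinceAdded, bool everSeenComplete) {
    const std::time_t kGracePeriod = 60 * 60;            // first hour: share and download
    const std::time_t kWarnAfter   = 7 * 24 * 60 * 60;   // 7x24 hours online

    if (everSeenComplete || secondsOnlineSinceAdded < kGracePeriod) {
        return Action::ShareAndDownload;   // give new and rare files a chance to spread
    }
    if (secondsOnlineSinceAdded >= kWarnAfter) {
        return Action::WarnUser;           // still never seen complete; ask the user
    }
    return Action::LowPriorityNoShare;     // keep looking, but don't bind resources
}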