From: Sean DiZazzo on
On May 13, 9:54 pm, Sean DiZazzo <half.ital...(a)gmail.com> wrote:
> On May 13, 9:39 am, News123 <news1...(a)free.fr> wrote:
>
>
>
> > > Hi Aahz,
>
> > Aahz wrote:
> > > In article <4bea6b50$0$8925$426a7...(a)news.free.fr>,
> > > News123  <news1...(a)free.fr> wrote:
> > >> I'd like to perform huge file uploads via https.
> > >> I'd like to make sure,
> > >> - that I can obtain upload progress info (sometimes the network is very slow)
> > >> - that (if the file exceeds a certain size) I don't have to
> > >>  read the entire file into RAM.
>
> > > Based on my experience with this, you really need to send multiple
> > > requests (i.e. "chunking").  There are ways around this (you can look
> > > into curl's resumable uploads), but you will need to maintain state no
> > > matter what, and I think that chunking is the best/simplest.
>
> > I agree I need chunking. (The question is just at which level of the
> > protocol.)
>
> > I just don't know how to implement a chunk-wise file upload or which
> > library is best.
>
> > Can you recommend any libraries or do you have a link to an example?
>
> > I'd like to avoid making separate HTTPS POST requests for the chunks
> > (at least if the underlying module does NOT support keep-alive connections).
>
> > I made some tests with high-level chunking (separate sequential HTTPS
> > POST requests).
> > What I noticed was a rather high penalty in data throughput.
> > The reason is probably that each request opens its own HTTPS connection
> > and that either the network driver or the TCP/IP stack doesn't allocate
> > enough bandwidth to my request.
>
> > Therefore I'd like to do the chunking at a 'lower' level.
> > One option would be an HTTPS module which supports keep-alive;
> > the other would be a library which creates the HTTP POST body
> > chunk by chunk.
>
> > What do others do for huge file uploads?
> > (The uploader might be connected via Ethernet, WLAN, UMTS, EDGE, or GPRS.)
>
> > N
>
> You could also just send the file in one big chunk and give yourself
> another avenue to read the size of the file on the server.  Maybe a
> webservice that you call with the name of the file that returns its
> percent complete, or it could just return bytes on disk and you do the
> math on the client side.  Then you just forget about the transfer and
> query the file size whenever you want to know...or on a schedule.
>
> ~Sean
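
For illustration, the client side of that size-polling idea might look
roughly like this (the status endpoint and its JSON reply are invented
for the sketch):

# Hypothetical progress poll: ask the server how many bytes of the
# upload have reached the disk and turn that into a percentage.
import json
import time
import urllib.request

def watch_upload(status_url, total_bytes, interval=5):
    while True:
        with urllib.request.urlopen(status_url) as resp:
            on_disk = int(json.load(resp)['bytes'])
        print('%.1f%% on server' % (100.0 * on_disk / total_bytes))
        if on_disk >= total_bytes:
            return
        time.sleep(interval)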

Oops...that doesn't help with the other requirements. My suggestion
is to not use HTTPS. I don't think it was designed to move around
large pieces of data; lots of small pieces, rather. SFTP?
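
If it does have to stay on HTTPS, though, here is a rough, untested
sketch of a streaming POST that covers both of the original
requirements, assuming the third-party requests library and a
placeholder URL:

# Rough sketch: a generator body makes requests send the file with
# chunked transfer-encoding, so it is read piecewise (never fully in
# RAM) and progress can be printed as it goes.
import os
import requests

def upload_with_progress(url, filename, chunk_size=64 * 1024):
    total = os.path.getsize(filename)

    def body():
        sent = 0
        with open(filename, 'rb') as f:
            while True:
                block = f.read(chunk_size)
                if not block:
                    return
                yield block
                sent += len(block)
                print('%.1f%% sent' % (100.0 * sent / total))

    return requests.post(url, data=body())
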
From: J.O. Aho on
News123 <news1234(a)free.fr> wrote:

> What do others do for huge file uploads?
> (The uploader might be connected via Ethernet, WLAN, UMTS, EDGE, or GPRS.)

In those cases where I have had to move big files it's been scp when you
just have to push a new file; in cases where it's a question of keeping
two directories synced, it's rsync over ssh.
The latter I have never done in Python.
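
For what it's worth, an scp-style push can be scripted from Python with
the paramiko library, which can report progress through a callback. A
minimal sketch (host, user and paths are placeholders):

# Sketch only: push one file over SSH/SFTP from Python with a progress
# callback; paramiko calls it with (bytes_so_far, total_bytes).
import paramiko

def progress(transferred, total):
    print('%d of %d bytes' % (transferred, total))

client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect('example.com', username='user')
sftp = client.open_sftp()
sftp.put('huge.bin', '/incoming/huge.bin', callback=progress)
sftp.close()
client.close()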


--

//Aho
From: News123 on
Hi Sean,




Sean DiZazzo wrote:
> On May 13, 9:54 pm, Sean DiZazzo <half.ital...(a)gmail.com> wrote:
>> On May 13, 9:39 am, News123 <news1...(a)free.fr> wrote:
>>
>>
>>
>>> Hi Aahz,
>>> Aahz wrote:
>>>> In article <4bea6b50$0$8925$426a7...(a)news.free.fr>,
>>>> News123 <news1...(a)free.fr> wrote:
>>>>> I'd like to perform huge file uploads via https.
>>>>> I'd like to make sure,

>
> Oops...that doesn't help with the other requirements. My suggestion
> is to not use HTTPS. I don't think it was designed to move around
> large pieces of data; lots of small pieces, rather. SFTP?


I had to check, but I guess sftp is not exactly suitable for my use case.

My problem:
- the whole communication is intended to work like a drop box
- one can upload files
- one cannot see what one has uploaded before
- there is no way to accidentally overwrite a previous upload, etc.
- I don't know enough about sftp servers to know how I could configure
  one to act as a drop box


That behaviour is much easier to hide behind an https server than behind
an out-of-the-box sftp server.
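
For illustration only, those drop-box rules could be enforced by a
handler as small as this WSGI sketch (the upload directory is made up;
it streams the body to disk so RAM use stays flat, and it refuses to
overwrite):

# Hypothetical drop-box endpoint behind an HTTPS-terminating server:
# accepts an upload, streams it to disk chunk by chunk, refuses to
# clobber an existing file, and never reveals what is already there.
import os

UPLOAD_DIR = '/var/dropbox'   # placeholder

def application(environ, start_response):
    name = os.path.basename(environ.get('PATH_INFO', '').lstrip('/'))
    target = os.path.join(UPLOAD_DIR, name)
    if not name or os.path.exists(target):
        start_response('409 Conflict', [('Content-Type', 'text/plain')])
        return [b'refused\n']
    remaining = int(environ.get('CONTENT_LENGTH') or 0)
    source = environ['wsgi.input']
    with open(target, 'wb') as out:
        while remaining > 0:
            chunk = source.read(min(64 * 1024, remaining))
            if not chunk:
                break
            out.write(chunk)
            remaining -= len(chunk)
    start_response('201 Created', [('Content-Type', 'text/plain')])
    return [b'stored\n']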



N
From: News123 on
Hi James,

James Mills wrote:
> On Wed, May 12, 2010 at 6:48 PM, News123 <news1234(a)free.fr> wrote:
>> Hi,
>>
>> I'd like to perform huge file uploads via https.
>> I'd like to make sure,
>> - that I can obtain upload progress info (sometimes the network is very slow)
>> - that (if the file exceeds a certain size) I don't have to
>> read the entire file into RAM.
>>
>
> My suggestion is to find some tools that can
> send multiple chunks of data. A non-blocking
> i/o library/tool might be useful here (e.g. twisted or similar).
>

I have never used twisted so far.
Perhaps it's time to look at it.
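
From a first look at its docs, an (untested) sketch might be as short
as this; FileBodyProducer is supposed to feed the POST body from the
open file instead of slurping it, and the URL is a placeholder:

# Untested sketch: POST a big file with twisted without reading it
# fully into RAM; FileBodyProducer streams it from the file object.
from twisted.internet import reactor
from twisted.web.client import Agent, FileBodyProducer
from twisted.web.http_headers import Headers

agent = Agent(reactor)
body = FileBodyProducer(open('huge.bin', 'rb'))
d = agent.request(
    b'POST', b'https://example.com/upload',
    Headers({b'Content-Type': [b'application/octet-stream']}),
    body)
d.addBoth(lambda _: reactor.stop())
reactor.run()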


bye


N
From: News123 on
Hi J,


J.O. Aho wrote:
> News123 <news1234(a)free.fr> wrote:
>
>> What do others do for huge file uploads?
>> (The uploader might be connected via Ethernet, WLAN, UMTS, EDGE, or GPRS.)
>
> In those cases where I have had to move big files it's been scp when you
> just have to push a new file; in cases where it's a question of keeping
> two directories synced, it's rsync over ssh.
> The latter I have never done in Python.


I agree. From home that is also what I do:
scp / rsync.


However I'd like to use https, as http and https are the two ports that
are accessible almost everywhere (even with proxies / firewalls, etc.).
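
To keep everything on port 443, one persistent connection could carry
the file as sequential chunk POSTs. A rough standard-library sketch
(the offset header and endpoint are invented, and the server would have
to reassemble the chunks):

# Rough sketch: sequential POSTs of fixed-size chunks over one
# persistent HTTPS connection, so progress is known after every chunk
# and the file is never fully in RAM.
import http.client
import os

def upload_in_chunks(host, path, filename, chunk_size=1024 * 1024):
    total = os.path.getsize(filename)
    conn = http.client.HTTPSConnection(host)   # reused for every chunk
    sent = 0
    with open(filename, 'rb') as f:
        while True:
            chunk = f.read(chunk_size)
            if not chunk:
                break
            conn.request('POST', path, body=chunk, headers={
                'Content-Type': 'application/octet-stream',
                'X-Upload-Offset': str(sent),   # invented resume header
            })
            resp = conn.getresponse()
            resp.read()   # drain so the connection can be reused
            if resp.status not in (200, 201, 204):
                raise RuntimeError('chunk failed: %d' % resp.status)
            sent += len(chunk)
            print('%.1f%% uploaded' % (100.0 * sent / total))
    conn.close()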



N