
Tuesday, March 27, 2012

Connect with local customers online

See how you can increase your chances of doing business with potential local customers with just a few tweaks to your pages.

Thursday, March 15, 2012

How to circumvent file download limits


I was sitting in a product training class this week and there was a proxy restriction on file download sizes. It can be avoided with something like Tor or any VPN software, but I wanted to use what I had available at that moment: a Linux server and a Mac client.

First of all, I tried to do it using curl alone, but the server I was trying to get the files from had range downloads disabled (or maybe it was the proxy itself). The command is very simple:

$ curl --remote-name --continue-at - --range -1024 url

You will have to run this command once for every byte-range part of the file, so you can try bigger ranges to reduce the number of runs.
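
For example, here is a minimal sketch of that loop, written with explicit byte ranges instead of --continue-at, assuming 1 MiB parts and a file that fits in four of them (url, the part size and the part count are placeholders to adjust):

$ for i in 0 1 2 3 ; do curl --output part$i --range $((i*1048576))-$((i*1048576+1048575)) url ; done
$ cat part0 part1 part2 part3 > filename

Each run of curl asks for one explicit byte interval and stores it as a separate part file, which cat then joins back in order.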

But if the web server or the proxy has range downloads disabled, this will not work, so you need a remote Linux server. On the remote server, download the file and use split to break it into smaller parts, so you can download them and put them back together again.

$ split --bytes=5MB filename

It will break the file into 5 MB chunks (you need to know the maximum download size the network allows). By default split names the parts xaa, xab, xac and so on; if the file splits into many small parts, you can ask for a longer suffix with --suffix-length.
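
For a larger file, a small sketch of that longer suffix, assuming GNU split and a hypothetical bigfile.iso:

$ split --bytes=5MB --suffix-length=3 bigfile.iso

With a three-letter suffix the parts come out as xaaa, xaab, xaac and so on, which leaves room for many more chunks before the names run out.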

On the client computer (Linux or Mac) you need one simple command to download the parts and another to join them back together:

$ for f in `jot -w xa%c 5 a` ; do curl --remote-name url/$f ; done

The jot command generates the sequence of strings xaa, xab, xac, xad, xae, which is fed to curl one at a time to fetch the remote parts (adjust the 5 to match the number of parts split created). When it is done, join the files back together using cat:

$ cat xa* > originalfilename
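
If jot is not available on the client (it ships with macOS and the BSDs, but usually not with Linux), bash brace expansion can generate the same names; a sketch assuming bash and the same five parts:

$ for f in xa{a..e} ; do curl --remote-name url/$f ; done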

That's it: big files downloaded from inside a restricted network.