MultiGet 1.0 installation problem

Internet access, browsing, chat, downloads, etc.
birkey
Posts: 7
Joined: 2006-10-17 20:13

MultiGet 1.0 installation problem

#1

Post by birkey » 2006-11-15 9:48

Steps
I downloaded the MultiGet 1.0 RPM and converted it to a .deb with alien:
alien --scripts multiget-1.0-1.i386.rpm
then installed it with dpkg -i multiget-1.0-1.i386.deb

Problem
After installation, launching MultiGet fails with a complaint that the libc version is too old:
MultiGet: /lib/tls/i686/cmov/libc.so.6: version `GLIBC_2.4' not found (required by MultiGet)

A check showed the system's current libc is 2.3.5. So I went to http://packages.ubuntulinux.org/edgy/base/libc6, downloaded glibc_2.4.orig.tar.gz, and unpacked it to glibc-2.4.
Running ./glibc-2.4/configure went "...yes" all the way until it ended with:
configure: error: C preprocessor "/lib/cpp" fails sanity check
so make is impossible.

Suspecting the cpp version, I checked with cpp -dumpversion:
4.0.3
Then I tried sudo aptitude upgrade cpp, which reported:
0 packages upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
Need to get 0B of archives. After unpacking 0B will be used.
So there is nothing to update!

What now? Am I really stuck with the beta3 build made for Ubuntu 6.06?
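As an aside (not from the original post): the installed glibc version can be confirmed directly, with no source build needed. A minimal sketch, assuming a glibc-based Debian/Ubuntu system:

```shell
# ldd ships with glibc, so its first version line reports the
# installed libc version.
ldd --version | head -n1

# On Debian/Ubuntu the package manager records the same information
# (guarded, in case dpkg is not present on this system).
if command -v dpkg >/dev/null 2>&1; then
    dpkg -s libc6 | grep '^Version:'
fi
```

On the poster's system this would report 2.3.x, confirming that a binary requiring GLIBC_2.4 cannot run without upgrading the distribution's libc, which is why the statically linked build mentioned in post #2 works.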

Solved

#2

Post by birkey » 2006-11-15 10:30

I went and downloaded a statically linked build for Ubuntu instead:
http://www.linuxfans.org/nuke/modules.p ... o&did=4506

#3

Post by glade » 2006-11-16 0:24

Good grief, why go to so much trouble?
Just download the tarball and extract it; it runs as-is.

#4

Post by deng » 2006-11-16 11:27

Yeah, just unpack it and it runs, a single executable file...
Programming is fun
==========================
http://oteam.cn

#5

Post by sysnotdown » 2006-11-18 15:08

This rpm wasn't built by me; it should be compatible with FC5/FC6, not with Ubuntu.

#6

Post by bones7456 » 2006-11-23 22:02

Heh, so the person who started all this has spoken!

[Suggestion]

#7

Post by smilehjh » 2006-11-23 23:16

Multithreaded download tools for Ubuntu:

On the command line there is axel.
Axel [recommended]
Axel is a command-line multithreaded download tool with resume support; it is usually several times faster than wget.
Compared with wget, axel reportedly fails on some links, though I have never run into that myself.
Basic usage: #axel [options] [output location] [URL]
[root@localhost axel-1.0a]# axel --help
Usage: axel [options] url1 [url2] [url...]

-s x Specify maximum speed (bytes per second)
-n x Specify maximum number of connections
-o f Specify local output file
-S [x] Search for mirrors and download from x servers
-N Just don't use any proxy server
-q Leave stdout alone
-v More status information
-a Alternate progress indicator
-h This information
-V Version information


A typical download (logged in as root):
#axel -n 10 -o /home/yourname/download/music/ http://xxxx/xx.mp3
The same command when logged in as the ordinary user yourname (non-root):
$sudo axel -n 10 -o /home/yourname/download/music/ http://xxxx/xx.mp3

This downloads the file at the given URL into /home/yourname/download/music/ using 10 connections.
Downloading a 3.8 MB mp3 took 30 s.
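What axel does under the hood is fetch several byte ranges of the same file in parallel and stitch them together. A minimal sketch of that idea with curl (file names here are illustrative; file:// is used so the example is self-contained, but the same -r ranges work over http:// against servers that support Range requests):

```shell
# Create a small local "download source".
printf 'hello, world' > source.bin
url="file://$PWD/source.bin"

# Fetch two byte ranges of the 12-byte file in parallel, like two
# of axel's connections would.
curl -s -r 0-5  -o part0 "$url" &
curl -s -r 6-11 -o part1 "$url" &
wait

# Concatenate the parts in order and verify the result.
cat part0 part1 > merged.bin
cmp source.bin merged.bin && echo OK
```

A real downloader also handles servers that ignore Range headers and resumes broken parts, which is what axel's -n and resume support give you for free.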
------------------------------------------
Just sudo apt-get install d4x.
Use webdownloader, i.e. d4x. It is very fast.
------------------------------------------

Another multithreaded download tool for Ubuntu: "puf"

puf v0.93.2a Copyright (C) 2000-2002 by Oswald Buddenhagen
based on puf v0.1.x Copyright (C) 1999,2000 by Anders Gavare
Usage: puf [options] [SPEC...]

SPEC format: URL[*disposition][^[^]proxy-URL]
URL format: [http://][user:pass@]host[.domain][:port][/path]

All options except those marked as global have effect only on the following
URLs. Their effect can be cancelled by specifying <original option>- without
any parameters possibly required by the original option or by overriding them
with another option with an opposite effect. All URL-local options can be
reverted to their default state by specifying a single comma as an argument.

What to download:
-p Download page requisites from same directory
-pr Download page requisites also from subdirectories (implies -p)
-pr+ Download page requisites from whole server (implies -pr)
-pr++ Download page requisites from whole net (implies -pr+)
-r Recurse download into subdirectories (implies -pr)
-r+ Recurse download across whole server (implies -r & -pr+)
-r++ Recurse download across whole net (implies -r+ & -pr++; caution!)
-ld NR Limit directory nesting level to NR (with -r)
-l NR Limit recursion depth to NR (with -r)
-lb NR Download only first NR bytes of every SPEC
-xg Allow recursion into URLs with ? signs (i.e., CGIs)
-ng Disallow ?-URLs, even if given on the command line
-F Treat all files as HTML (scan for links)
-B STR Prefix to add to every SPEC on the command line
-i FILE Read SPECs from FILE

What to do with existing files:
-u Update existing files, continue partial
-c Continue download of partial files
-nc Don't clobber existing files

Storage of files:
-na Don't use hostname aliases for directory names (global)
-nd Don't create subdirectories
-xd Create all subdirectories (default for -r+ & -r++)
-O STR Save next SPEC to file STR
-P STR Save files to directory STR/
-xi STR Set the name for anonymous index files (default is index.html)
-xe Enumerate files (1.puf, ...) in download order. Implies -nd
-xq Quote file names suitably for storage on FAT file systems
-xE Enumerate files in command line order. Implies -nd
-nt Don't timestamp files according to server response
-nb Delete partial files from broken downloads

Network options:
-ni Don't send "If-Range:" (assume up-to-date partial files)
-nR Don't send "Referer:"
-U STR Send "User-Agent: STR" (use "" for none)
-iU FILE Choose User-Agent strings from FILE
-xH STR Add arbitrary header STR to HTTP requests
-Tl NR Set DNS lookup timeout to NR seconds (global; default is 60)
-Tc NR Set connect timeout to NR seconds (default is 60)
-Td NR Set data timeout to NR seconds (default is 120)
-t NR Set maximum number of download attempts per URL (default is 5)
-nw Don't wait before connecting a busy/dead host
-xb IP Bind outgoing connections to IP
-ib FILE Bind outgoing connections to random IPs from FILE
-y PRX Use proxy PRX. Multiple -y's are allowed
-yi FILE Read proxies from FILE. PRX format: URL[*load ratio]

Resource usage quotas (global):
-Q NR Abort puf after NR bytes (unlimited by default)
-Qu NR Abort puf after NR URLs (unlimited by default)
-Qt NR Abort puf after NR seconds (unlimited by default)
-lc NR Max NR simultaneous connections (default is 20)
-ll NR Max NR simultaneous DNS lookups (default is 10)
-nf Use fewer file descriptors. Slightly slower
-nh Do fewer DNS lookups. May miss some references

Logging (global):
-ns Disable download progress statistics
-v Be verbose (show errors). Implies -ns
-vv Be very verbose (show warnings). Implies -v
-vvv Be extremely verbose (show infos). Implies -vv
-d NR Debug: URL=1 DNS=2 QUE=4 CON=8 HDR=16 CHK=32 MEM=64
-h This help screen

Example:
puf -P stuff -r+ www.foo.com -r www.bar.com -r- www.some.org , www.blub.de
--------------------------------------------
Just an ordinary creator!
Helping you move from Windows to Ubuntu: http://forum.ubuntu.org.cn/viewtopic.php?t=34081