All times are UTC + 8 hours



Post new topic  Reply to topic  [ 7 posts ]
Author  Message
Post #1
 Subject: MultiGet 1.0 installation problem
Posted: 2006-11-15 9:48

Joined: 2006-10-17 20:13
Posts: 7
Thanks given: 0 times
Thanks received: 0 times
What I did
Downloaded the MultiGet 1.0 rpm and converted it to a deb with alien:
alien --to-deb --scripts multiget-1.0-1.i386.rpm
then installed it with dpkg -i multiget-1.0-1.i386.deb

The problem
After installation, launching MultiGet fails with a complaint that the libc version is too old:
MultiGet: /lib/tls/i686/cmov/libc.so.6: version `GLIBC_2.4' not found (required by MultiGet)
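This kind of mismatch can be diagnosed before installing anything, by comparing the glibc symbol versions a binary demands with what the system ships. A minimal check, using /bin/ls as a stand-in for the MultiGet binary (objdump comes from binutils and may not be installed):

```shell
# What glibc does this system provide? (the post above shows 2.3.5)
getconf GNU_LIBC_VERSION
ldd --version | head -n1
# Which GLIBC_x.y symbol versions does a given binary require?
# (/bin/ls stands in for the MultiGet binary)
if command -v objdump >/dev/null; then
    objdump -T /bin/ls | grep -o 'GLIBC_[0-9.]*' | sort -u
fi
```

If the highest version listed by objdump exceeds what getconf reports, the binary will refuse to start with exactly the error shown above.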

A check showed the system has libc 2.3.5. So I went to http://packages.ubuntulinux.org/edgy/base/libc6, downloaded glibc_2.4.orig.tar.gz, and extracted it to glibc-2.4.
Running ./glibc-2.4/configure went ...yes all the way through, but ended with:
configure: error: C preprocessor "/lib/cpp" fails sanity check
so it never gets to make.

Suspecting the cpp version, I checked it with cpp -dumpversion:
4.0.3
then ran sudo aptitude upgrade cpp, which reported:
0 packages upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
Need to get 0B of archives. After unpacking 0B will be used.
Nothing to upgrade!
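For what it's worth, the "/lib/cpp fails sanity check" error usually points at missing toolchain pieces rather than an outdated cpp. The check itself just feeds a trivial file through the preprocessor and can be reproduced by hand; the package name in the comment is the usual Debian/Ubuntu fix, offered here as an assumption rather than a confirmed cure for this case:

```shell
# Reproduce configure's sanity check to see whether cpp itself is broken:
if echo 'int x;' | cpp - >/dev/null 2>&1; then
    echo "cpp sanity check: ok"
else
    # typically fixed by installing the compiler/header bundle:
    echo "cpp sanity check: FAILED - try: sudo apt-get install build-essential"
fi
```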

What now? Am I really stuck with the beta3 that was built for Ubuntu 6.06?


Post #2
 Subject: Solved
Posted: 2006-11-15 10:30

Joined: 2006-10-17 20:13
Posts: 7
Thanks given: 0 times
Thanks received: 0 times
Downloaded a statically linked build for Ubuntu instead:
http://www.linuxfans.org/nuke/modules.p ... o&did=4506


Post #3
 Subject:
Posted: 2006-11-16 0:24

Joined: 2005-10-21 16:57
Posts: 1383
Thanks given: 0 times
Thanks received: 0 times
Good grief, why go to all that trouble?
Just download the archive and extract it - it runs as is.
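The extract-and-run route in a nutshell. Since the thread never names the actual tarball, the archive below is a stand-in built on the spot so the example is self-contained; only the last two lines are what a user would actually do after downloading the real thing:

```shell
# Build a stand-in tarball (the real archive name and contents will differ):
mkdir -p multiget-1.0
printf '#!/bin/sh\necho "MultiGet placeholder"\n' > multiget-1.0/multiget
chmod +x multiget-1.0/multiget
tar czf multiget-1.0-linux.tar.gz multiget-1.0
rm -r multiget-1.0

# The actual "no install needed" route: unpack and run the executable.
tar xzf multiget-1.0-linux.tar.gz
./multiget-1.0/multiget
```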


Post #4
 Subject:
Posted: 2006-11-16 11:27

Joined: 2006-04-09 14:09
Posts: 130
Location: Beijing
Thanks given: 0 times
Thanks received: 0 times
Right, just unpack it and it runs - it's a single executable...


_________________
Programming is fun
==========================
http://oteam.cn


Post #5
 Subject:
Posted: 2006-11-18 15:08

Joined: 2006-09-24 22:43
Posts: 710
Thanks given: 0 times
Thanks received: 0 times
That rpm wasn't built by me; it should be compatible with FC5/FC6, not with Ubuntu.


Post #6
 Subject:
Posted: 2006-11-23 22:02

Joined: 2006-04-12 20:05
Posts: 8495
Location: Hangzhou
Thanks given: 0 times
Thanks received: 8 times
Heh, so the one who started all this finally speaks up!


Post #7
 Subject: [Suggestion]
Posted: 2006-11-23 23:16

Joined: 2006-10-21 18:48
Posts: 15
Location: Shanghai
Thanks given: 0 times
Thanks received: 0 times
Multi-threaded download tools on Ubuntu:

On the command line there is axel.
Axel [recommended]
Axel is a command-line multi-threaded downloader with resume support; it is usually several times faster than wget.
Compared with wget, axel supposedly fails on some links, though I have never run into that myself.
Basic usage: # axel [options] [url]
[root@localhost axel-1.0a]# axel --help
Usage: axel [options] url1 [url2] [url...]

-s x Specify maximum speed (bytes per second)
-n x Specify maximum number of connections
-o f Specify local output file
-S [x] Search for mirrors and download from x servers
-N Just don't use any proxy server
-q Leave stdout alone
-v More status information
-a Alternate progress indicator
-h This information
-V Version information


A typical download (logged in as root):
#axel -n 10 -o /home/yourname/download/music/ http://xxxx/xx.mp3
Run as the normal user yourname (not root):
$sudo axel -n 10 -o /home/yourname/download/music/ http://xxxx/xx.mp3

This downloads the file at the given URL into /home/yourname/download/music/ using 10 connections.
A 3.8 MB mp3 took about 30 s to download.
------------------------------------------
Just run sudo apt-get install d4x.
That gets you webdownloader for X, i.e. d4x. It is quite fast.
------------------------------------------

Yet another multi-threaded downloader for Ubuntu: "puf"

puf v0.93.2a Copyright (C) 2000-2002 by Oswald Buddenhagen
based on puf v0.1.x Copyright (C) 1999,2000 by Anders Gavare
Usage: puf [options] [SPEC...]

SPEC format: URL[*disposition][^[^]proxy-URL]
URL format: [http://][user:pass@]host[.domain][:port][/path]

All options except those marked as global have effect only on the following
URLs. Their effect can be cancelled by specifying <original option>- without
any parameters possibly required by the original option or by overriding them
with another option with an opposite effect. All URL-local options can be
reverted to their default state by specifying a single comma as an argument.

What to download:
-p Download page requisites from same directory
-pr Download page requisites also from subdirectories (implies -p)
-pr+ Download page requisites from whole server (implies -pr)
-pr++ Download page requisites from whole net (implies -pr+)
-r Recurse download into subdirectories (implies -pr)
-r+ Recurse download across whole server (implies -r & -pr+)
-r++ Recurse download across whole net (implies -r+ & -pr++; caution!)
-ld NR Limit directory nesting level to NR (with -r)
-l NR Limit recursion depth to NR (with -r)
-lb NR Download only first NR bytes of every SPEC
-xg Allow recursion into URLs with ? signs (i.e., CGIs)
-ng Disallow ?-URLs, even if given on the command line
-F Treat all files as HTML (scan for links)
-B STR Prefix to add to every SPEC on the command line
-i FILE Read SPECs from FILE

What to do with existing files:
-u Update existing files, continue partial
-c Continue download of partial files
-nc Don't clobber existing files

Storage of files:
-na Don't use hostname aliases for directory names (global)
-nd Don't create subdirectories
-xd Create all subdirectories (default for -r+ & -r++)
-O STR Save next SPEC to file STR
-P STR Save files to directory STR/
-xi STR Set the name for anonymous index files (default is index.html)
-xe Enumerate files (1.puf, ...) in download order. Implies -nd
-xq Quote file names suitably for storage on FAT file systems
-xE Enumerate files in command line order. Implies -nd
-nt Don't timestamp files according to server response
-nb Delete partial files from broken downloads

Network options:
-ni Don't send "If-Range:" (assume up-to-date partial files)
-nR Don't send "Referer:"
-U STR Send "User-Agent: STR" (use "" for none)
-iU FILE Choose User-Agent strings from FILE
-xH STR Add arbitrary header STR to HTTP requests
-Tl NR Set DNS lookup timeout to NR seconds (global; default is 60)
-Tc NR Set connect timeout to NR seconds (default is 60)
-Td NR Set data timeout to NR seconds (default is 120)
-t NR Set maximum number of download attempts per URL (default is 5)
-nw Don't wait before connecting a busy/dead host
-xb IP Bind outgoing connections to IP
-ib FILE Bind outgoing connections to random IPs from FILE
-y PRX Use proxy PRX. Multiple -y's are allowed
-yi FILE Read proxies from FILE. PRX format: URL[*load ratio]

Resource usage quotas (global):
-Q NR Abort puf after NR bytes (unlimited by default)
-Qu NR Abort puf after NR URLs (unlimited by default)
-Qt NR Abort puf after NR seconds (unlimited by default)
-lc NR Max NR simultaneous connections (default is 20)
-ll NR Max NR simultaneous DNS lookups (default is 10)
-nf Use fewer file descriptors. Slightly slower
-nh Do fewer DNS lookups. May miss some references

Logging (global):
-ns Disable download progress statistics
-v Be verbose (show errors). Implies -ns
-vv Be very verbose (show warnings). Implies -v
-vvv Be extremely verbose (show infos). Implies -vv
-d NR Debug: URL=1 DNS=2 QUE=4 CON=8 HDR=16 CHK=32 MEM=64
-h This help screen

Example:
puf -P stuff -r+ www.foo.com -r www.bar.com -r- www.some.org , www.blub.de
--------------------------------------------


_________________
Just an ordinary creator!
Help for moving from Windows to Ubuntu: http://forum.ubuntu.org.cn/viewtopic.php?t=34081






Powered by phpBB © 2000, 2002, 2005, 2007 phpBB Group
Simplified Chinese language pack translated by 王笑宇