When I run an update, I get the following error.
GPG error: http://cran.wustl.edu maverick/ Release: The following signatures couldn't be verified because the public key is not available: NO_PUBKEY 51716619E084DAB9
So I ran the following command and got this error message:
$ gpg --keyserver subkeys.pgp.net --recv 51716619E084DAB9
gpg: requesting key E084DAB9 from hkp server subkeys.pgp.net
gpg: key E084DAB9: "Michael Rutter <marutter@gmail.com>" not changed
gpg: Total number processed: 1
gpg: unchanged: 1
How do I set up a valid public key, or what else can I do to fix this?
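From the gpg output above, the key seems to already be in my user keyring, so my guess is that the missing step is getting it into apt's keyring. Would something like this (using apt-key, with the key ID from the error message) be the right approach?

# Export the key from my user keyring into apt's trusted keyring, then refresh
gpg --export --armor 51716619E084DAB9 | sudo apt-key add -
sudo apt-get update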
I am trying to install WordPress following the guide below. However, when I open a browser to set up WordPress, I get an error saying there was a problem establishing a database connection. I think this is because the chown command failed.
http://www.techkaki.com/2011/04/how-to-install-wordpress-locally-on-ubuntu-10-10-with-lamp/
chown -R www-data /var/www/wordpress
I get a flood of error messages:
...
chown: changing ownership of `/var/www/wordpress/wp-admin/network/themes.php': Operation not permitted
chown: changing ownership of `/var/www/wordpress/wp-admin/network/users.php': Operation not permitted
chown: changing ownership of `/var/www/wordpress/wp-admin/network/index.php': Operation not permitted
chown: changing ownership of `/var/www/wordpress/wp-admin/network/sites.php': Operation not permitted
chown: changing ownership of `/var/www/wordpress/wp-admin/network/user-new.php': Operation not permitted
chown: changing ownership of `/var/www/wordpress/wp-admin/network/setup.php': Operation not permitted
chown: changing ownership of `/var/www/wordpress/wp-admin/network/theme-install.php': Operation not permitted
chown: changing ownership of `/var/www/wordpress/wp-admin/network/plugins.php': Operation not permitted
Does anyone know what is going on here?
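My guess is that "Operation not permitted" means I ran chown as a regular user. Would simply rerunning the tutorial's command with root privileges, like this, fix it?

# Rerun the same chown with sudo so it has permission to change ownership
sudo chown -R www-data /var/www/wordpress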
Ubuntu and LaTeX newbie here. I have a very basic question.
I recently installed LaTeX with the following command:
sudo apt-get install texlive
However, I want to uninstall it and keep just a base LaTeX setup for now.
I tried the following, but it didn't work:
sudo apt-get remove texlive
dpkg --get-selections | grep tex shows that many LaTeX-related packages are still installed.
How do I remove texlive?
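I suspect that removing the texlive metapackage leaves behind all the packages it pulled in. I have seen suggestions to purge everything matching texlive, along these lines (the pattern is quoted so the shell passes it to apt-get instead of expanding it), but would that be the right way to go?

# Purge every package whose name matches texlive, then remove leftover dependencies
sudo apt-get purge 'texlive*'
sudo apt-get autoremove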
I am working in Ubuntu 10.10 and trying to start a single-node cluster in Hadoop.
hadoop@abraham-Dimension-3000:/usr/local/hadoop$ bin/start-all.sh
mkdir: cannot create directory `/usr/local/hadoop/bin/../logs': Permission denied
starting namenode, logging to /usr/local/hadoop/bin/../logs/hadoop-hadoop-namenode-abraham-Dimension-3000.out
/usr/local/hadoop/bin/hadoop-daemon.sh: line 117: /usr/local/hadoop/bin/../logs/hadoop-hadoop-namenode-abraham-Dimension-3000.out: No such file or directory
head: cannot open `/usr/local/hadoop/bin/../logs/hadoop-hadoop-namenode-abraham-Dimension-3000.out' for reading: No such file or directory
localhost: mkdir: cannot create directory `/usr/local/hadoop/bin/../logs': Permission denied
localhost: starting datanode, logging to /usr/local/hadoop/bin/../logs/hadoop-hadoop-datanode-abraham-Dimension-3000.out
localhost: /usr/local/hadoop/bin/hadoop-daemon.sh: line 117: /usr/local/hadoop/bin/../logs/hadoop-hadoop-datanode-abraham-Dimension-3000.out: No such file or directory
localhost: head: cannot open `/usr/local/hadoop/bin/../logs/hadoop-hadoop-datanode-abraham-Dimension-3000.out' for reading: No such file or directory
localhost: mkdir: cannot create …
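From the "Permission denied" on mkdir, my guess is that the hadoop user cannot write to /usr/local/hadoop. Would giving that user ownership of the install tree, as below, be the right fix? (The hadoop user and path are the ones from my setup above.)

# Give the hadoop user ownership of the whole install directory, then retry
sudo chown -R hadoop:hadoop /usr/local/hadoop
bin/start-all.sh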
I am new to Ubuntu and have been trying to install Python's easy_install so that I don't have to deal with source files when installing Python libraries.

I ran the following command, and it seems to have installed the right package:
sudo apt-get install python-setuptools
However, when I run easy_install sqlalchemy or easy_install pysqlite3, it doesn't work.
I get the following error message:
install_dir /usr/local/lib/python2.6/dist-packages/
error: can't create or remove files in install directory
The following error occurred while trying to add or remove files in the
installation directory:
[Errno 13] Permission denied: '/usr/local/lib/python2.6/dist-packages/test-easy-install-1674.pth'
The installation directory you specified (via --install-dir, --prefix, or
the distutils default setting) was:
/usr/local/lib/python2.6/dist-packages/
Perhaps your account does not have write access to this directory? If …
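Reading the Errno 13, I assume my account simply cannot write to /usr/local/lib/python2.6/dist-packages. Would running the installs with sudo, as below, be the correct workaround?

# Install the same packages system-wide with root privileges
sudo easy_install sqlalchemy
sudo easy_install pysqlite3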
I am currently dual-booting Ubuntu 10.10 and Windows XP. However, I rarely use Windows and would like to remove it.

I have used an Ubuntu live CD to get into GParted. I have also deleted the Windows partitions (I think); they were labeled ntfs and fat32.
I am just not sure how to extend the Ubuntu partition to cover the unallocated space. Also, how do I deal with the "key" designation that says a partition is busy?
Help!
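One thing I have read is that the key icon in GParted means the partition is in use, and that an active swap partition is the usual culprit. Would deactivating swap from the live CD, like this, release it so I can resize?

# Turn off all swap so GParted can modify the swap partition
sudo swapoff -a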
Hi everyone, I have another very basic question. Please bear with me.
I am following the instructions on the site below to download and configure Hadoop on my machine.
http://www.michael-noll.com/tutorials/running-hadoop-on-ubuntu-linux-single-node-cluster/
I am on the final SSH section, where localhost comes into play. When I run the following commands, I get an error message.
hadoop@amathew-Dimension-3000:~$ cat $HOME/.ssh/id_rsa.pub >> $HOME/.ssh/authorized_keys
hadoop@amathew-Dimension-3000:~$ ssh localhost
ssh: connect to host localhost port 22: Connection refused
Here is some additional information that may be relevant.
hadoop@amathew-Dimension-3000:~$ ssh -vvv localhost
OpenSSH_5.5p1 Debian-4ubuntu5, OpenSSL 0.9.8o 01 Jun 2010
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Applying options for *
debug2: ssh_connect: needpriv 0
debug1: Connecting to localhost [::1] port 22.
debug1: connect to address ::1 port 22: Connection refused
debug1: Connecting to localhost [127.0.0.1] port 22.
debug1: connect to address 127.0.0.1 …
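Since "Connection refused" on port 22 suggests nothing is listening there, I am guessing I still need an SSH server. Would installing and starting openssh-server, as below, be the right fix?

# Install the OpenSSH server and start the daemon
sudo apt-get install openssh-server
sudo service ssh start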
I am using the following site to install and configure Hadoop in Ubuntu 10.10: http://arifn.web.id/blog/2010/07/29/running-hadoop-single-cluster.html

However, when I try to format the Hadoop file system, I get the following error.
amathew@amathew-Dimension-3000:~$ cd /usr/local/hadoop
amathew@amathew-Dimension-3000:/usr/local/hadoop$ bin/hadoop namenode -format
11/04/16 21:23:07 INFO namenode.NameNode: STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting NameNode
STARTUP_MSG: host = amathew-Dimension-3000/192.168.1.66
STARTUP_MSG: args = [-format]
STARTUP_MSG: version = 0.20.2
STARTUP_MSG: build = https://svn.apache.org/repos/asf/hadoop/common/branches/branch-0.20 -r 911707; compiled by 'chrisdo' on Fri Feb 19 08:07:34 UTC 2010
************************************************************/
11/04/16 21:23:08 INFO namenode.FSNamesystem: fsOwner=amathew,amathew,adm,dialout,cdrom,plugdev,lpadmin,admin,sambashare
11/04/16 21:23:08 INFO namenode.FSNamesystem: supergroup=supergroup
11/04/16 21:23:08 INFO namenode.FSNamesystem: isPermissionEnabled=true
11/04/16 21:23:08 ERROR namenode.NameNode: java.io.IOException: Cannot create directory /usr/local/hadoop-datastore/hadoop/dfs/name/current
at org.apache.hadoop.hdfs.server.common.Storage$StorageDirectory.clearDirectory(Storage.java:295) …
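My reading of the IOException is that /usr/local/hadoop-datastore either does not exist or is not writable by my user. Would creating it and handing it to my account (amathew, per the fsOwner line above), like this, be the right fix? (The path is taken from the error message.)

# Create the datastore directory, make my user its owner, then retry the format
sudo mkdir -p /usr/local/hadoop-datastore
sudo chown -R amathew:amathew /usr/local/hadoop-datastore
bin/hadoop namenode -format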