Posts by pet*_*man

Omniauth-facebook keeps reporting invalid_credentials

I'm trying to implement omniauth-facebook as described in Railscast #360 and have hit a fairly major roadblock. When I click the sign-in link I get the expected popup asking for my Facebook credentials, but when I submit them I get an OmniAuth::Strategies::OAuth2::CallbackError. The Apache log prints: (facebook) Authentication failure! invalid_credentials: OmniAuth::Strategies::OAuth2::CallbackError, OmniAuth::Strategies::OAuth2::CallbackError

Here is the relevant code:

omniauth.rb

OmniAuth.config.logger = Rails.logger

Rails.application.config.middleware.use OmniAuth::Builder do
  provider :facebook, ENV['FACEBOOK_APP_ID'], ENV['FACEBOOK_SECRET']
end
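The initializer above only behaves as expected if both environment variables are actually set in the environment Apache starts the app under. A quick sanity check from rails console (a minimal sketch, assuming the same ENV names used above):

# Confirm the values the Facebook strategy will actually receive;
# nil for either would mean an empty app ID/secret is sent to Facebook.
puts ENV['FACEBOOK_APP_ID'].inspect
puts ENV['FACEBOOK_SECRET'].inspect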

sessions_controller.rb

class SessionsController < ApplicationController
  def create
    user = User.from_omniauth(env["omniauth.auth"])
    session[:user_id] = user.id
    redirect_to root_url
  end

  def destroy
    session[:user_id] = nil
    redirect_to root_url
  end
end
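For context, the request that populates env["omniauth.auth"] arrives on the OmniAuth callback route. The routes file is not shown in the post, so the following is only a sketch of what the Railscast #360 setup typically wires up; the paths and controller names are assumptions:

# config/routes.rb (hypothetical sketch)
Rails.application.routes.draw do
  # Facebook redirects back to /auth/facebook/callback; OmniAuth fills in
  # env["omniauth.auth"] and this route hands the request to SessionsController#create.
  get 'auth/:provider/callback', to: 'sessions#create'
  # Where OmniAuth sends the browser when a strategy reports a failure,
  # e.g. the invalid_credentials error above.
  get 'auth/failure', to: redirect('/')
end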

application.html.erb

<div id="fb-root"></div>
<script>        
window.fbAsyncInit = function() {
    FB.init({
        appId      : '(**my app id**)', // App ID
        status     : true, // check login …

facebook ruby-on-rails omniauth

28 votes · 2 answers · 20k views

HBase streaming with Spark: NotSerializableException

I'm trying to stream data from HBase using Spark. When I run my Scala script, this is the error I get:

ERROR Executor: Exception in task 0.0 in stage 10.0 (TID 10)
java.io.NotSerializableException: org.apache.hadoop.hbase.io.ImmutableBytesWritable

At first I thought my data was incorrectly formatted, so I tried creating a basic table with just a single row:

row1 column=fam1:c1, timestamp=1422306700801, value=abc

Even with just this one row I still get the same error. Is there something obvious I'm missing? Here is the script:

import java.io.{ByteArrayOutputStream, DataOutputStream}

import org.apache.hadoop.hbase.HBaseConfiguration
import org.apache.hadoop.hbase.client.{Result, Scan}
import org.apache.hadoop.hbase.io.ImmutableBytesWritable
import org.apache.hadoop.hbase.mapreduce.TableInputFormat
import org.apache.hadoop.hbase.util.Base64

// Serialize the Scan into the Base64 string that TableInputFormat expects.
def convertScanToString(scan: Scan): String = {
  val out: ByteArrayOutputStream = new ByteArrayOutputStream
  val dos: DataOutputStream = new DataOutputStream(out)
  scan.write(dos)
  Base64.encodeBytes(out.toByteArray)
}

val conf = HBaseConfiguration.create()
val scan = new Scan()
scan.setCaching(500)
scan.setCacheBlocks(false)
conf.set(TableInputFormat.INPUT_TABLE, "test_table")
conf.set(TableInputFormat.SCAN, convertScanToString(scan))
// sc is the SparkContext provided by spark-shell.
val rdd = sc.newAPIHadoopRDD(conf, classOf[TableInputFormat], classOf[ImmutableBytesWritable], classOf[Result])
rdd.first
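As an aside, the hand-rolled convertScanToString helper reproduces what TableInputFormat.SCAN expects: a Base64-encoded, serialized Scan. HBase ships an equivalent helper, so (assuming the HBase mapreduce classes are on the classpath, and reusing the conf and scan values from the script) that configuration line can also be written as:

import org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil

// Performs the same Scan-to-Base64-string conversion as the helper above.
conf.set(TableInputFormat.SCAN, TableMapReduceUtil.convertScanToString(scan))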

Edit: as requested, here is the full stack trace

15/01/26 21:50:50 ERROR Executor: Exception in task 0.0 in stage 14.0 (TID 14)
java.io.NotSerializableException: org.apache.hadoop.hbase.io.ImmutableBytesWritable …
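For anyone hitting the same trace: ImmutableBytesWritable (and Result) do not implement java.io.Serializable, and rdd.first has to serialize one record back to the driver, which is where the exception is thrown. A common workaround, sketched here on the assumption of an HBase 0.96+ client API and not presented as the asker's eventual fix, is to map the HBase pair into plain serializable types on the executors before collecting anything:

import org.apache.hadoop.hbase.CellUtil
import org.apache.hadoop.hbase.util.Bytes

// Convert each (ImmutableBytesWritable, Result) pair into Strings so that
// nothing non-serializable ever has to travel back to the driver.
val readable = rdd.map { case (key, result) =>
  val rowKey = Bytes.toString(key.copyBytes())
  val cells = result.rawCells().map { cell =>
    val family    = Bytes.toString(CellUtil.cloneFamily(cell))
    val qualifier = Bytes.toString(CellUtil.cloneQualifier(cell))
    val value     = Bytes.toString(CellUtil.cloneValue(cell))
    s"$family:$qualifier=$value"
  }.toSeq
  (rowKey, cells)
}

readable.first   // only Strings and Seqs cross the wire now

Registering the offending classes with Kryo (spark.serializer set to org.apache.spark.serializer.KryoSerializer plus SparkConf.registerKryoClasses) is the other route commonly taken.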

hbase scala apache-spark

2 votes · 1 answer · 3,426 views

Tag statistics

apache-spark ×1

facebook ×1

hbase ×1

omniauth ×1

ruby-on-rails ×1

scala ×1