I'm running into a problem allocating a large array in NumPy on Ubuntu 18 that I don't hit on macOS.
I'm trying to allocate memory for a NumPy array with shape (156816, 36, 53806) using
np.zeros((156816, 36, 53806), dtype='uint8')
On Ubuntu I get an error:
>>> import numpy as np
>>> np.zeros((156816, 36, 53806), dtype='uint8')
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
numpy.core._exceptions.MemoryError: Unable to allocate array with shape (156816, 36, 53806) and data type uint8
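For scale, this array is enormous: at one byte per element it needs roughly 283 GiB of memory. Linux with its default overcommit settings typically refuses such an allocation up front, while macOS tends to overcommit and only back pages on first write, which would explain the platform difference. A quick back-of-the-envelope check:

```python
import numpy as np

shape = (156816, 36, 53806)
itemsize = np.dtype('uint8').itemsize                      # 1 byte per element
nbytes = int(np.prod(shape, dtype=np.int64)) * itemsize    # total allocation size
print(nbytes)          # 303755101056 bytes
print(nbytes / 2**30)  # ~282.9 GiB
```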
I don't get it on macOS:
>>> import numpy as np
>>> np.zeros((156816, 36, 53806), dtype='uint8')
array([[[0, 0, 0, ..., 0, 0, 0],
[0, 0, 0, ..., 0, 0, 0],
[0, 0, 0, ..., 0, 0, 0], …

I have timestamps in UTC ISO 8601 format, but with Structured Streaming they get automatically converted to local time. Is there a way to stop this conversion? I would like to keep them in UTC.
I'm reading JSON data from Kafka and then parsing it with Spark's from_json function.
Input:
{"Timestamp":"2015-01-01T00:00:06.222Z"}
Stream:
SparkSession
.builder()
.master("local[*]")
.appName("my-app")
.getOrCreate()
.readStream()
.format("kafka")
... //some magic
.writeStream()
.format("console")
.start()
.awaitTermination();
Schema:
StructType schema = DataTypes.createStructType(new StructField[] {
DataTypes.createStructField("Timestamp", DataTypes.TimestampType, true),});
Output:
+--------------------+
| Timestamp|
+--------------------+
|2015-01-01 01:00:...|
|2015-01-01 01:00:...|
+--------------------+
As you can see, the hour is automatically incremented.
PS: I tried the from_utc_timestamp Spark function, but no luck.
I'm trying to dynamically bind an image src to a URL of the form ../assets/project_screens/image_name.jpg.
My directory structure looks like this:
- src
-- assets
---- project_screens
-- profile
---- ProjectList.vue
-- store
---- profile.js
I know I need to use webpack's require("path/to/image") so that webpack bundles these images, but the path does not resolve.
// store/profile.js
projects = [
{
....
},
{
....
img_url: "../assets/project_screens/im1.jpg"
....
}
....
]
It works if I hard-code the URL string instead of passing it dynamically. I've looked around Stack Overflow and none of the solutions helped. Is there something else I'm missing?
I'm using react-navigation in a React Native app.
I keep getting an error that is supposed to be a development-only warning that won't show in production.
How do I fix the error below?
console.error: "The action 'NAVIGATE' with payload
{"name":"192.168.100.189:19000","params":{}} was not handled by any
navigator.
Do you have a screen named '192.168.100.189:19000'?
If you're trying to navigate to a screen in a nested navigator, see
https://reactnavigation.org/docs/nesting-navigators#navigating-to-a-screen-in-a-nested-navigator.
This is a development-only warning and won't be shown in production."
SparkSession
.builder
.master("local[*]")
.config("spark.sql.warehouse.dir", "C:/tmp/spark")
.config("spark.sql.streaming.checkpointLocation", "C:/tmp/spark/spark-checkpoint")
.appName("my-test")
.getOrCreate
.readStream
.schema(schema)
.json("src/test/data")
.cache
.writeStream
.start
.awaitTermination
I get an error when I run this example on Spark 2.1.0. Without the .cache option it works as expected, but with .cache I get:
Exception in thread "main" org.apache.spark.sql.AnalysisException: Queries with streaming sources must be executed with writeStream.start();;
FileSource[src/test/data]
at org.apache.spark.sql.catalyst.analysis.UnsupportedOperationChecker$.org$apache$spark$sql$catalyst$analysis$UnsupportedOperationChecker$$throwError(UnsupportedOperationChecker.scala:196)
at org.apache.spark.sql.catalyst.analysis.UnsupportedOperationChecker$$anonfun$checkForBatch$1.apply(UnsupportedOperationChecker.scala:35)
at org.apache.spark.sql.catalyst.analysis.UnsupportedOperationChecker$$anonfun$checkForBatch$1.apply(UnsupportedOperationChecker.scala:33)
at org.apache.spark.sql.catalyst.trees.TreeNode.foreachUp(TreeNode.scala:128)
at org.apache.spark.sql.catalyst.analysis.UnsupportedOperationChecker$.checkForBatch(UnsupportedOperationChecker.scala:33)
at org.apache.spark.sql.execution.QueryExecution.assertSupported(QueryExecution.scala:58)
at org.apache.spark.sql.execution.QueryExecution.withCachedData$lzycompute(QueryExecution.scala:69)
at org.apache.spark.sql.execution.QueryExecution.withCachedData(QueryExecution.scala:67)
at org.apache.spark.sql.execution.QueryExecution.optimizedPlan$lzycompute(QueryExecution.scala:73)
at org.apache.spark.sql.execution.QueryExecution.optimizedPlan(QueryExecution.scala:73)
at org.apache.spark.sql.execution.QueryExecution.sparkPlan$lzycompute(QueryExecution.scala:79)
at org.apache.spark.sql.execution.QueryExecution.sparkPlan(QueryExecution.scala:75)
at org.apache.spark.sql.execution.QueryExecution.executedPlan$lzycompute(QueryExecution.scala:84)
at org.apache.spark.sql.execution.QueryExecution.executedPlan(QueryExecution.scala:84)
at org.apache.spark.sql.execution.CacheManager$$anonfun$cacheQuery$1.apply(CacheManager.scala:102)
at org.apache.spark.sql.execution.CacheManager.writeLock(CacheManager.scala:65)
at org.apache.spark.sql.execution.CacheManager.cacheQuery(CacheManager.scala:89)
at org.apache.spark.sql.Dataset.persist(Dataset.scala:2479)
at org.apache.spark.sql.Dataset.cache(Dataset.scala:2489)
at org.me.App$.main(App.scala:23)
at org.me.App.main(App.scala)
Any ideas?
Hi everyone, I'm trying to publish my Angular library to npm, but when I log in I get this:
npm ERR! code EAI_AGAIN
npm ERR! errno EAI_AGAIN
npm ERR! request to http://registry.npmjs.org/-/user/org.couchdb.user:belzee10 failed, reason: getaddrinfo EAI_AGAIN registry.npmjs.org:80
npm ERR! A complete log of this run can be found in:
npm ERR! C:\Users\belzee\AppData\Roaming\npm-cache\_logs\2018-01-08T16_03_35_050Z-debug.log
Versions:
node: 8.9.3
npm: 5.5.1
I'm behind an authenticated proxy and I have already configured proxy and https-proxy:
npm config set proxy http://Username:Pa55w0rd@proxyhostname:port
npm config set https-proxy http://Username:Pa55w0rd@proxyhostname:port
Thanks for your attention.
I want to create an Apache container and mount the current working directory as a volume in the container, so I have this code:
volumes:
- ${DOCUMENT_ROOT}:/var/www/html
The value of ${DOCUMENT_ROOT} is a dot, set in the .env file:
DOCUMENT_ROOT=.
My docker-compose.yml file is at the root of my project directory, and there is a .docker directory inside the project directory.
I've tried these 3 variants:
volumes:
- .:/var/www/html
volumes:
- ./:/var/www/html
volumes:
- ${DOCUMENT_ROOT}:/var/www/html
But I get this error:
Creating 7.4.x-webserver ... error ERROR: for 7.4.x-webserver Cannot
create container for service webserver: b'create .: volume name is too
short, names should be at least two alphanumeric characters'
ERROR: for webserver Cannot create container for service webserver:
b'create .: volume name is too short, names should be at least two
alphanumeric …

I'm using Firebase Storage to upload a file, but it doesn't work. Here is my code:
FirebaseStorage storage = FirebaseStorage.getInstance();
StorageReference storageRef = storage.getReferenceFromUrl("gs://the-food-house.appspot.com/");
// Create a reference to "file"
StorageReference mStorage = storageRef.child("Album Avatar")
.child(UserUID)
.child(AvatarUser.getLastPathSegment());
mStorage.putFile(AvatarUser).addOnSuccessListener(new OnSuccessListener<UploadTask.TaskSnapshot>() {
@Override
public void onSuccess(UploadTask.TaskSnapshot taskSnapshot) {
Toast.makeText(SignUpWithEmail.this, "UPLOAD FILE OK", Toast.LENGTH_SHORT).show();
}
}).addOnFailureListener(new OnFailureListener() {
@Override
public void onFailure(@NonNull Exception e) {
Log.d("ERROR", e.toString());
Toast.makeText(SignUpWithEmail.this, "Failed", Toast.LENGTH_SHORT).show();
}
});
Here is the error I get:
com.google.firebase.storage.StorageException: An unknown error occurred, please check the HTTP result code and inner exception for server response.
Here is the error detail:
Unrecognized GLES …

I have an image as bytes:
print(image_bytes)
b'\xff\xd8\xff\xfe\x00\x10Lavc57.64.101\x00\xff\xdb\x00C\x00\x08\x04\x04\x04\x04\x04\x05\x05\x05\x05\x05\x05\x06\x06\x06\x06\x06\x06\x06\x06\x06\x06\x06\x06\x06\x07\x07\x07\x08\x08\x08\x07\x07\x07\x06\x06\x07\x07\x08\x08\x08\x08\t\t\t\x08\x08\x08\x08\t\t\n\n\n\x0c\x0c\x0b\x0b\x0e\x0e\x0e\x11\x11\x14\xff\xc4\x01\xa2\x00\x00\x01\x05\x01\x01\x01\x01\x01\x01\x00\x00\x00\x00\x00\x00\x00\x00\x01\x02\x03\x04\x05\x06\x07\x08\t\n\x0b\x01\x00\x03\x01\x01\x01\x01\x01\x01\x01\x01\x01\x00\x00\ ... some other stuff
I can convert it to a NumPy array using Pillow:
image = numpy.array(Image.open(io.BytesIO(image_bytes)))
But I'd really rather not use Pillow. Is there a way to do this with plain OpenCV, or better yet directly with NumPy, or with some other faster library?
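A sketch of the usual Pillow-free route: wrap the bytes in a NumPy view with np.frombuffer and, since the buffer here is a compressed JPEG, hand that view to OpenCV's cv2.imdecode. The cv2 call is left commented out below because OpenCV may not be installed; the bytes literal is just the start of the JPEG from the question:

```python
import numpy as np
# import cv2  # uncomment if OpenCV is available

# First bytes of the JPEG from the question (0xFF 0xD8 is the JPEG SOI marker)
image_bytes = b'\xff\xd8\xff\xfe\x00\x10Lavc57.64.101\x00'

buf = np.frombuffer(image_bytes, dtype=np.uint8)  # zero-copy 1-D view of the bytes
# image = cv2.imdecode(buf, cv2.IMREAD_COLOR)     # would decode to an HxWx3 BGR array
print(buf[:4])  # [255 216 255 254]
```

Note that np.frombuffer alone only reinterprets the raw bytes; the actual JPEG decoding still needs a codec such as cv2.imdecode.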
I'm using Spring Boot OAuth2 with Facebook login and I'm running into an error:
JSON parse error: Cannot deserialize instance of java.lang.String out of START_OBJECT token
The same code works for Google, where login behaves as expected. I'm following this code on GitHub (https://github.com/callicoder/spring-boot-react-oauth2-social-login-demo).
Can you guide me on how to solve this?
@Override
protected void configure(HttpSecurity http) throws Exception {
http
.cors()
.and()
.sessionManagement()
.sessionCreationPolicy(SessionCreationPolicy.STATELESS)
.and()
.csrf()
.disable()
.formLogin()
.disable()
.httpBasic()
.disable()
.exceptionHandling()
.authenticationEntryPoint(new RestAuthenticationEntryPoint())
.and()
.authorizeRequests()
.antMatchers("/","/public/**",
"/login",
"/register",
"/error",
"/favicon.ico",
"/**/*.png",
"/**/*.gif",
"/**/*.svg",
"/**/*.jpg",
"/**/*.html",
"/fonts/*.*",
"/webfonts/*.*",
"/**/*.css",
"/**/*.js")
.permitAll()
.antMatchers("/auth/**", "/oauth2/**")
.permitAll()
.anyRequest()
.authenticated()
.and()
.oauth2Login()
.authorizationEndpoint()
.baseUri("/oauth2/authorize")
.authorizationRequestRepository
(cookieAuthorizationRequestRepository())
.and()
.redirectionEndpoint()
.baseUri("/oauth2/callback/*")
.and()
.userInfoEndpoint()
.userService(customOAuth2UserService)
.and() …