A problem in Eclipse when running an Android application:

An internal error occurred during "Launching New_configuration(1)".
Project path must have only one segment.
I want to do something like this:

This is a list view row containing a name and a user image. I did some searching and got the image rounding done, but it is not a perfect solution. Any help would be appreciated.

Here is the code I added to the ImageLoader class:
public Bitmap processBitmap(Bitmap bitmap) {
    int pixels = 0;
    if (mRound == 0)
        pixels = 120;
    else
        pixels = mRound;
    Bitmap output = Bitmap.createBitmap(bitmap.getWidth(),
            bitmap.getHeight(), Config.ARGB_8888);
    Canvas canvas = new Canvas(output);
    final int color = 0xff424242;
    final Paint paint = new Paint();
    final Rect rect = new Rect(0, 0, bitmap.getWidth(), bitmap.getHeight());
    final RectF rectF = new RectF(rect);
    final float roundPx = pixels;
    paint.setAntiAlias(true);
    canvas.drawARGB(0, 0, 0, 0);
    paint.setColor(color);
    canvas.drawRoundRect(rectF, roundPx, roundPx, paint);
    paint.setXfermode(new PorterDuffXfermode(Mode.SRC_IN));
    canvas.drawBitmap(bitmap, …

I am working on FFmpeg for Android. I have successfully compiled ffmpeg-2.0.1; after that I created an Android.mk file in my NDK's sources/ffmpeg-2.0.1/android/arm folder as:
LOCAL_PATH:= $(call my-dir)
include $(CLEAR_VARS)
LOCAL_MODULE:= libavcodec
LOCAL_SRC_FILES:= lib/libavcodec-55.so
LOCAL_EXPORT_C_INCLUDES := $(LOCAL_PATH)/include
include $(PREBUILT_SHARED_LIBRARY)
After that I created an Android project, and in the Android project the Android.mk file is:
LOCAL_PATH := $(call my-dir)
include $(CLEAR_VARS)
LOCAL_MODULE := tutorial01
LOCAL_SRC_FILES := tutorial01.c
LOCAL_LDLIBS := -llog -ljnigraphics -lz
LOCAL_SHARED_LIBRARIES := libavformat libavcodec libswscale libavutil
include $(BUILD_SHARED_LIBRARY)
$(call import-module,ffmpeg-2.0.1/android/arm)
but it is showing a …
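One thing worth checking: the app-side Android.mk above links four shared libraries (`libavformat libavcodec libswscale libavutil`), but the prebuilt Android.mk shown earlier only defines `libavcodec`. Each prebuilt module needs its own block in sources/ffmpeg-2.0.1/android/arm/Android.mk. A sketch of the missing blocks, where the `.so` file names are an assumption based on the library versions a standard ffmpeg-2.0.1 shared build produces:

```make
# Hypothetical continuation of sources/ffmpeg-2.0.1/android/arm/Android.mk:
# one PREBUILT_SHARED_LIBRARY block per module the app links against.
include $(CLEAR_VARS)
LOCAL_MODULE:= libavformat
LOCAL_SRC_FILES:= lib/libavformat-55.so
LOCAL_EXPORT_C_INCLUDES := $(LOCAL_PATH)/include
include $(PREBUILT_SHARED_LIBRARY)

include $(CLEAR_VARS)
LOCAL_MODULE:= libswscale
LOCAL_SRC_FILES:= lib/libswscale-2.so
LOCAL_EXPORT_C_INCLUDES := $(LOCAL_PATH)/include
include $(PREBUILT_SHARED_LIBRARY)

include $(CLEAR_VARS)
LOCAL_MODULE:= libavutil
LOCAL_SRC_FILES:= lib/libavutil-52.so
LOCAL_EXPORT_C_INCLUDES := $(LOCAL_PATH)/include
include $(PREBUILT_SHARED_LIBRARY)
```

Check the actual file names under arm/lib and adjust the version suffixes to match.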
I am making an app, and for the last 3 days I have not been able to solve one problem despite googling as much as possible. I draw a circle on a canvas and want to crop that part of the image and show it zoomed. My first step looks like this on screen:

In this I select the region. The code I use is this:
private float x, y;
private boolean zooming = false;
private Paint mPaint;
private Matrix mmatrix;
private Shader mShader;
private Bitmap mBitmap;
private List<Point> mpoints;
private List<MyPoints> mpointlist;
private Path mpath;
private Canvas mcanvas;
private Bitmap mresult_bitmap, resultingImage, finalbitmap;
private Context mcontext;
private boolean bfirstpoint = false;
private Point mfirstpoint = null;
private Point mlastpoint = null;

public CircularZoomView(Context context) {
    super(context);
    mcontext = context;
    mpath = new Path();
    mpoints = new ArrayList<Point>();
    setBackgroundResource(R.drawable.testing);
    mPaint = new Paint();
    mresult_bitmap …

I am making a custom calendar in Android. My requirement is to support selecting multiple dates, like the one shown in the image; does anyone have a suggestion? Right now I draw the calendar on a View and try to draw a path following the touch, but it is not working for me. Here is my code:
public class CalendarView extends View {
    private float width;  // width of one tile
    private float height; // height of one tile
    private int selX;     // X index of selection
    private int selY;     // Y index of selection
    private final Rect selRect = new Rect();
    private GregorianCalendar month, itemmonth; // calendar instances
    private CalendarAdapter adapter; // adapter instance
    private Context mContext;
    private GregorianCalendar pmonthmaxset;
    private GregorianCalendar selectedDate;
    private ArrayList<String> items;
    private List<String> dayString;
    private GregorianCalendar pmonth; // calendar instance …

I am trying to subscribe to an observable, for example:
List<String> colors = Arrays.asList("RED", "BLACK", "WHITE", "GREEN", "YELLOW", "BROWN", "PURPUL", "BLUE");
Observable.just(colors).subscribe(s -> System.out.println(s));
It works fine, but if I use a method reference the compiler gives the error "void is not a functional interface".

Can anyone explain this in a bit more depth? As I understand it, subscribe accepts a Consumer functional interface, which returns nothing, yet we can still print the stream data, for example:
Observable.just(colors).subscribe(s -> System.out::println);// NOT COMPILE
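The behavior can be reproduced with plain `java.util.function.Consumer`, with no RxJava involved. A subscribe-style method takes a `Consumer<T>`, so the method reference itself must be the argument; `s -> System.out::println` fails because the lambda body is then a bare method reference, which is not a statement and has no value to return. A minimal sketch (the `subscribe` helper is hypothetical, standing in for RxJava's `Observable.subscribe`):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import java.util.function.Consumer;

public class MethodRefDemo {
    // A subscribe-like method: accepts a Consumer<T>, just as
    // RxJava's Observable.subscribe(Consumer<? super T>) does.
    static <T> void subscribe(List<T> source, Consumer<T> onNext) {
        for (T item : source) {
            onNext.accept(item);
        }
    }

    public static void main(String[] args) {
        List<String> colors = Arrays.asList("RED", "BLACK", "WHITE");
        List<String> seen = new ArrayList<>();

        // Lambda form compiles: the body is a statement expression.
        subscribe(colors, s -> seen.add(s));

        // Method-reference form also compiles: the reference itself IS the
        // Consumer -- it is not wrapped inside another lambda.
        subscribe(colors, seen::add);

        // subscribe(colors, s -> seen::add) would NOT compile: the lambda
        // body would be a bare method reference, which is neither a
        // statement nor an expression with a standalone type.

        System.out.println(seen);
    }
}
```

So the fix for the line above is `subscribe(System.out::println)` rather than `subscribe(s -> System.out::println)`.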
I have been working with FFmpeg for the past few days but cannot get any worthwhile output. I then followed the tutorial below:
http://www.roman10.net/how-to-build-ffmpeg-with-ndk-r9/
According to that tutorial, after the build completes you should be able to find a $NDK/sources/ffmpeg-2.0.1/android folder containing arm/lib and arm/include folders. But I am not getting any build output in $NDK/sources/ffmpeg-2.0.1/android. Please, can someone help me.

Thanks.
I created a GridView that shows videos from a server. Each grid item has a video thumbnail image and the video duration. To load the video thumbnails I use UniversalImageLoader, and I load the video duration lazily via an AsyncTask. But if someone scrolls the grid view quickly, the video duration shows up in the wrong position. To create the lazy loading I am following the link below:
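Misplaced durations like this are the classic recycled-view race: by the time an AsyncTask finishes, GridView may have reused its row for a different position. A common guard is to tag the row with the position it was bound for and re-check that tag before applying the async result. A minimal plain-Java model of the idea (class and field names are hypothetical; the Android view classes are replaced by a stub so the pattern stands alone):

```java
import java.util.HashMap;
import java.util.Map;

public class StaleResultGuard {
    // Stands in for a recycled grid row: 'tag' records which position
    // the row currently displays (set in getView when the row is bound).
    static class Row {
        int tag;
        String durationText = "";
    }

    // Cache so a finished duration can be re-applied the next time
    // that position is bound, instead of re-running the task.
    private final Map<Integer, String> durations = new HashMap<>();

    // Called when the background work started for 'position' completes.
    void onDurationLoaded(Row row, int position, String duration) {
        durations.put(position, duration);
        if (row.tag == position) {
            row.durationText = duration; // row not recycled: safe to update
        }
        // else: the row now shows another item -- drop the stale result
    }

    public static void main(String[] args) {
        StaleResultGuard guard = new StaleResultGuard();
        Row row = new Row();
        row.tag = 3;                           // row bound to position 3
        guard.onDurationLoaded(row, 3, "1:05"); // result still valid
        System.out.println(row.durationText);
    }
}
```

In an adapter this means: in getView, set the tag (position or media URL) on the holder before starting the task, and have onPostExecute compare against it.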
@Override
public View getView(int position, View convertView, ViewGroup parent) {
    final TextView durationTextView;
    View view = null;
    if (convertView == null) {
        view = mInflater.inflate(R.layout.camera_roll_item, parent, false);
        MediaItemViewHolder mediaItemViewHolder = new MediaItemViewHolder();
        mediaItemViewHolder.highlightTagIcon = (ImageView) view.findViewById(R.id.iv_media_grid_item_highlight_tag);
        mediaItemViewHolder.mediaTypeIcon = (ImageView) view.findViewById(R.id.iv_media_grid_item_type);
        mediaItemViewHolder.mediaClipLength = (TextView) view.findViewById(R.id.tv_media_grid_item_length);
        mediaItemViewHolder.mediaThumbnail = (ImageView) view.findViewById(R.id.iv_media_grid_item_thumbnail);
        mediaItemViewHolder.cameraItemSelectedView = (RelativeLayout) view.findViewById(R.id.rl_item_selection_parent);
        mediaItemViewHolder.progressContainer = (RelativeLayout) view.findViewById(R.id.rl_grid_loader_parent);
        view.setTag(mediaItemViewHolder);
    } else {
        view = convertView; // (MediaItemViewHolder) convertView.getTag();
        // mediaItemViewHolder.mediaClipLength.setText("");
        Log.i(TAG, "set blank to ");
    }
    MediaItemViewHolder mediaItemViewHolder = (MediaItemViewHolder) convertView.getTag();
    durationTextView = …

After successfully building FFmpeg via the steps below:
http://stackoverflow.com/questions/22471514/ffmpeg-build-output-is-not-showing
I have now copied the include folder and all the .a files into my jni folder, and my Android.mk file is:
LOCAL_PATH := $(call my-dir)
include $(CLEAR_VARS)
LOCAL_MODULE := ffmpegutils
LOCAL_SRC_FILES := tutorial02.c
LOCAL_C_INCLUDES := $(LOCAL_PATH)/include
LOCAL_LDLIBS := -L$(NDK_PLATFORMS_ROOT)/$(TARGET_PLATFORM)/arch-arm/usr/lib -L$(LOCAL_PATH) -lavformat -lavcodec -lavfilter -lavutil -lswscale -llog -ljnigraphics -lz -ldl -lgcc
include $(BUILD_SHARED_LIBRARY)
But my project gives the following errors:
Description Resource Path Location Type
make: *** [obj/local/armeabi/libffmpegutils.so] Error 1 MainActivity C/C++ Problem
undefined reference to 'ANativeWindow_unlockAndPost' MainActivity line 231, external location: /home/kiwitech/Documents/development/tools/ndk/android-ndk-r9d/toolchains/arm-linux-androideabi-4.6/prebuilt/linux-x86/arm-linux-androideabi/bin/ld: ./obj/local/armeabi/objs/ffmpegutils/tutorial02.o: in function …

I am making a game in which the player ("Bob") moves vertically and continuously collects coins. If the player does not manage to collect any coins for 5 seconds, Bob starts falling, and over time he falls faster.
My question is: how do I keep track of elapsed time in a LibGDX (Java) application?

Sample code is below.
public void update(float deltaTime) {
    velocity.add(accel.x * deltaTime, accel.y * deltaTime);
    position.add(velocity.x * deltaTime, velocity.y * deltaTime);
    bounds.x = position.x - bounds.width / 2;
    bounds.y = position.y - bounds.height / 2;
    if (velocity.y > 0 && state == BOB_COLLECT_COINE) {
        if (state == BOB_STATE_JUMP) {
            state = BOB_STATE_Increase;
            stateTime = 0;
        } else {
            if (state != BOB_STATE_JUMP) {
                state = BOB_STATE_JUMP; // BOB_STATE_JUMP
                stateTime = 0;
            }
        }
    }
    if (velocity.y < 0 && state != BOB_COLLECT_COINE) {
        if (state != …
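To the elapsed-time question above: LibGDX already hands each frame's delta to `update(float deltaTime)` (or it can be read from `Gdx.graphics.getDeltaTime()`), so the usual approach is to accumulate it in a float and reset the accumulator whenever a coin is collected. A minimal sketch with no LibGDX dependency; the class, field, and constant names are hypothetical:

```java
public class CoinTimer {
    // Seconds without collecting a coin before Bob starts falling.
    static final float FALL_TIMEOUT = 5.0f;

    private float timeSinceLastCoin = 0f;

    // Call once per frame with the frame's delta time
    // (in LibGDX, the deltaTime passed to update/render).
    public void update(float deltaTime) {
        timeSinceLastCoin += deltaTime;
    }

    // Call whenever Bob collects a coin.
    public void onCoinCollected() {
        timeSinceLastCoin = 0f;
    }

    // True once 5 seconds have passed with no coin collected.
    public boolean shouldFall() {
        return timeSinceLastCoin >= FALL_TIMEOUT;
    }

    // The same accumulator can drive the "falls faster over time" rule:
    // the scale grows the longer Bob has gone without a coin.
    public float fallSpeedScale() {
        return shouldFall() ? 1f + (timeSinceLastCoin - FALL_TIMEOUT) : 0f;
    }

    public static void main(String[] args) {
        CoinTimer timer = new CoinTimer();
        for (int frame = 0; frame < 360; frame++) {
            timer.update(1f / 60f); // simulate 6 seconds at 60 FPS
        }
        System.out.println(timer.shouldFall());
    }
}
```

In the game's `update(float deltaTime)` above, this reduces to one extra field: add `deltaTime` to it every call, zero it in the coin-collection branch, and test it against the timeout before applying the falling state.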