Angular2/RXJS - Handling Potentially Long Queries

use*_*572 5 javascript api http rxjs angular

Goal: The front-end of the application allows users to select files from their local machines and send the file names to a server. The server then matches those file names against files located on the server and returns a list of all matching files.

Issue: This works great if a user selects fewer than a few hundred files; otherwise it can cause long response times. I do not want to limit the number of files a user can select, and I don't want to worry about the HTTP requests timing out on the front-end.

Sample code so far:

//html on front-end to collect file information
<div>
    <input (change)="add_files($event)" type="file" multiple>
</div>

//component method called from the template; it passes the $event object
//along to the profile_service's add_files function
add_files($event){

    this.profile_service.add_files($event).subscribe(
        data => console.log('request returned'),
        err => console.error(err),
        () => { /* update view function */ }
    );
}

//The following two functions are in my profile_service, which is dependency-injected into my component
//formats the event object for the eventual query
add_files(event_obj){

    let file_arr = [];
    let file_obj = event_obj.target.files;

    for(let key in file_obj){
        if (file_obj.hasOwnProperty(key)){
            file_arr.push(file_obj[key]['name']);
        }
    }

    let query_obj = {files: file_arr};

    return this.save_files(query_obj);
}

//here is where the actual request to the back-end is made
save_files(query_obj){

    let payload = JSON.stringify(query_obj);
    let headers = new Headers();

    headers.append('Content-Type', 'application/json');
    return this.http.post('https://some_url/api/1.0/collection',payload,{headers:headers})
        .map((res:Response) => res.json())
}
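
For completeness, the service above assumes Angular 2's Http module and RxJS 5 with patch-style operator imports; something along these lines would be needed at the top of the service file (a sketch of the assumed setup, not shown in the original snippet):

//assumed imports for the profile_service (Angular 2 + RxJS 5.x)
import { Injectable } from '@angular/core';
import { Http, Headers, Response } from '@angular/http';
import 'rxjs/add/operator/map';    //enables .map() on the Http observable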

Possible Solutions:

  1. Process requests in batches. Rewrite the code so that the profile-service is only called with 25 files at a time, and upon each response call the profile-service again with the next 25 files. If this is the best solution, is there an elegant way to do this with observables? If not, I will use recursive callbacks, which should work fine.

  2. Have the endpoint return a generic response immediately, like "file matches being uploaded and saved to your profile". Since all the matching files are persisted to a db on the back-end, this would work, and the front-end could then query the db every so often to get the current list of matching files (a rough polling sketch follows this list). This seems ugly, but I figured I'd throw it out there.
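
To illustrate option 2, here is a rough polling sketch using RxJS 5; the GET on the same /api/1.0/collection endpoint and the 5-second interval are assumptions for illustration only:

//hypothetical polling helper in the profile_service (assumed endpoint and interval)
//requires: import 'rxjs/add/observable/interval' and import 'rxjs/add/operator/switchMap'
poll_matches(){
    return Observable.interval(5000)  //poll every 5 seconds (arbitrary choice)
        .switchMap(() => this.http.get('https://some_url/api/1.0/collection'))
        .map((res: Response) => res.json());
}

The component could subscribe to this and unsubscribe once the returned list stops growing.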

Any other solutions are welcome. It would be great to get a best practice for handling this type of long-running query with Angular 2/observables in an elegant way.

pau*_*els 3

I would suggest breaking the number of files you search for into manageable batches and then processing more as the results return, i.e. solution #1. Here is an untested, but I think fairly elegant, way to accomplish this:

add_files(event_obj){

    let file_arr = [];
    let file_obj = event_obj.target.files;

    for(let key in file_obj){
        if (file_obj.hasOwnProperty(key)){
            file_arr.push(file_obj[key]['name']);
        }
    }

    //requires: import 'rxjs/add/observable/from', 'rxjs/add/observable/defer',
    //'rxjs/add/operator/bufferCount' and 'rxjs/add/operator/concatMap'
    let self = this;
    let bufferedFiles = Observable.from(file_arr)
        .bufferCount(25); //Nice round number that you could play with

    return bufferedFiles

        //concatMap will make sure that each of your requests is not executed
        //until the previous one completes. Then all the data is merged into a single output
        .concatMap((arr) => {

            let payload = JSON.stringify({files: arr});
            let headers = new Headers();
            headers.append('Content-Type', 'application/json');

            //Use defer because http.post is eager;
            //this makes it execute only after subscription
            return Observable.defer(() =>
                self.http.post('https://some_url/api/1.0/collection', payload, {headers: headers}));

        }, (arr, resp) => resp.json());
}

concatMap prevents the server from doing more than a buffer's worth of work at a time by holding back each new request until the previous one returns. You could also use mergeMap if you want them all to execute in parallel, but in this case it sounds like the server is the constrained resource, if I'm not mistaken.
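
If you did want some parallelism without flooding the server, a middle ground is mergeMap's optional concurrency argument; a rough sketch of that variation (the limit of 3 in-flight requests is an arbitrary assumption):

//requires: import 'rxjs/add/operator/mergeMap'
//at most 3 batch requests are in flight at once (3 is an arbitrary choice)
return bufferedFiles
    .mergeMap((arr) => {
        let payload = JSON.stringify({files: arr});
        let headers = new Headers();
        headers.append('Content-Type', 'application/json');
        return Observable.defer(() =>
            self.http.post('https://some_url/api/1.0/collection', payload, {headers: headers}));
    }, (arr, resp) => resp.json(), 3);

With concatMap the batches also come back in order; with mergeMap they arrive as each request completes.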