Handle Multiple File Uploads In Laravel Vapor (Serverless)

Fikri Mastor


For this tutorial, I assume you already have some experience deploying a Laravel application to a serverless platform, where deployment itself is the easy, hassle-free part.

Let’s get started with the content of this post.

Challenge

The challenges we have to face while using serverless:

  • Vapor's JavaScript helper streams one file at a time to S3, so multiple file uploads have to be orchestrated ourselves.
  • AWS Lambda limits the request payload to roughly 4.5 MB, so larger files cannot be uploaded through the application itself.

Requirements

I’ll be using the following stack for this example: a Laravel application deployed on Vapor, Livewire with Alpine.js on the front end, and the official laravel-vapor NPM package (plus axios) to handle the file uploads in the browser.

The Flow

In the Blade file, between the <script></script> tags, we’ll add a few lines of JavaScript. I got this idea from a colleague who is more experienced with JavaScript.

        // Metadata for every file streamed to S3, plus the current upload progress.
        let filesData = [];
        let uploadProgress = 0;

        async function handleFileUpload() {
            // Read the files selected in the #fileUploads file input.
            const files = document.getElementById("fileUploads").files;

            await handleUpload(files);
            await handlePostUpload(filesData);
        }

        /** Stream every selected file to S3, one by one */
        async function handleUpload(files) {
            console.log("Processing files");

            for (const file of files) {
                filesData.push(await vaporUpload(file));
            }

            console.log("Files processed successfully");
        }

        /** Ask the backend to process the uploaded files */
        function handlePostUpload(files) {
            return axios.post("{{ route('file_upload.file-upload-vapor') }}", { files });
        }

        /** Stream a single file to S3 */
        function vaporUpload(item) {
            return Vapor.store(item, {
                progress: progress => {
                    uploadProgress = Math.round(progress * 100);
                }
            }).then(response => {
                console.log(response, item);

                return {
                    "key": response.key,
                    "name": item.name,
                    "extension": response.extension,
                    "size": item.size,
                };
            });
        }

Step 1

Since we are using Livewire components, we can use Alpine.js syntax such as @click.

<input @click.self="openError = false" type="file" id="fileUploads" name="file[]"
    class="rounded-sm border border-gray-700 p-1" multiple>

<button @click="handleFileUpload()"
    class="mx-2 my-1 flex cursor-pointer items-center justify-center rounded bg-blue-400 px-3 py-2 text-[13px] font-bold text-white hover:bg-blue-600 lg:my-0 lg:mx-1"
    type="submit">
    <svg xmlns="http://www.w3.org/2000/svg" fill="none" viewBox="0 0 24 24" stroke-width="1.5"
        stroke="currentColor" class="h-3 w-3">
        <path stroke-linecap="round" stroke-linejoin="round"
            d="M3 16.5v2.25A2.25 2.25 0 005.25 21h13.5A2.25 2.25 0 0021 18.75V16.5m-13.5-9L12 3m0 0l4.5 4.5M12 3v13.5" />
    </svg>

    <span class="ml-2">
        Upload
    </span>
</button>

For example, we have a button with @click="handleFileUpload()" like the one above. Clicking it triggers the function we added under the <script></script> tag:

        async function handleFileUpload() {
            // Read the files selected in the #fileUploads file input.
            const files = document.getElementById("fileUploads").files;

            await handleUpload(files);
            await handlePostUpload(filesData);
        }

The button click kicks off the JavaScript flow. We deliberately wait for every single file to finish streaming directly to the S3 bucket first, which is why we use await: it ensures each task is completed before moving on to the next one.

Step 2

        /** Stream every selected file to S3, one by one */
        async function handleUpload(files) {
            console.log("Processing files");

            for (const file of files) {
                filesData.push(await vaporUpload(file));
            }

            console.log("Files processed successfully");
        }

In the handleUpload(files) function, we loop over the selected files and stream them directly to S3, one by one. For each file we call another function that streams it to S3 using the official Vapor NPM package, as described in the documentation.

Install the laravel-vapor NPM package.

npm install --save-dev laravel-vapor

Then, initialize the global Vapor JavaScript object inside app.js:

window.Vapor = require('laravel-vapor');

With that in place, the vaporUpload(item) function in our Blade script can call Vapor.store():

        /** Stream a single file to S3 */
        function vaporUpload(item) {
            return Vapor.store(item, {
                progress: progress => {
                    uploadProgress = Math.round(progress * 100);
                }
            }).then(response => {
                console.log(response, item);

                return {
                    "key": response.key,
                    "name": item.name,
                    "extension": response.extension,
                    "size": item.size,
                };
            });
        }

So vaporUpload(item) streams a single file directly to the S3 bucket, where it is stored under a tmp/ directory.

Step 3

Finally, after all the files have landed in the tmp/ directory of the S3 bucket, the next function, handlePostUpload(files), is triggered.

        /** Ask the backend to process the uploaded files */
        function handlePostUpload(files) {
            return axios.post("{{ route('file_upload.file-upload-vapor') }}", { files });
        }

This function sends a POST request to an API endpoint, where the file type (and anything else you want to verify) can be checked and the files processed further.
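The backend is outside the scope of this post, but here is a minimal sketch of what that endpoint could look like, following the pattern from the Vapor documentation of copying the temporary S3 object to a permanent location with the Storage facade. The route name file_upload.file-upload-vapor comes from the Blade snippet above; the controller name, the files payload shape, the allowed extensions, and the uploads/ destination are assumptions for illustration only.

    // app/Http/Controllers/FileUploadVaporController.php (hypothetical controller)
    namespace App\Http\Controllers;

    use Illuminate\Http\Request;
    use Illuminate\Support\Facades\Storage;

    class FileUploadVaporController extends Controller
    {
        public function __invoke(Request $request)
        {
            // Validate the metadata that vaporUpload() collected in the browser.
            $validated = $request->validate([
                'files' => ['required', 'array'],
                'files.*.key' => ['required', 'string'],
                'files.*.name' => ['required', 'string'],
                'files.*.extension' => ['required', 'string', 'in:jpg,jpeg,png,pdf'], // assumed allow-list
                'files.*.size' => ['required', 'integer'],
            ]);

            foreach ($validated['files'] as $file) {
                // Copy each temporary object out of the tmp/ prefix into permanent storage.
                Storage::copy($file['key'], 'uploads/'.basename($file['key']).'.'.$file['extension']);
            }

            return response()->json(['message' => 'Files processed.']);
        }
    }

The matching route definition would be along these lines:

    // routes/web.php
    use App\Http\Controllers\FileUploadVaporController;
    use Illuminate\Support\Facades\Route;

    Route::post('/file-upload-vapor', FileUploadVaporController::class)
        ->name('file_upload.file-upload-vapor');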

That’s all there is to handling multiple file uploads on serverless with Laravel Vapor. Till next time.

