Uploading data
Ingest data through the UI or SDK, or connect your own cloud storage.
You can add data to your datasets on Mindkosh in a few different ways.
Uploading using the UI
You can upload most data types directly from the UI. To do this, either create a new dataset, or select an existing dataset. On the dataset page, click on the Add files button at the top. Then you can upload your data by simply dragging and dropping your files. Note that you will only see files with supported file formats for the selected dataset type.
Adding reference images to point clouds (for example, for sensor fusion use cases) is not supported through the UI; these can only be added using the SDK. The same applies to reference images for other images - for example, depth and thermal images attached to an RGB image.
Supported formats for Images
PNG (single-channel and RGB), JPG, JPEG, BMP, WEBP
Supported formats for point clouds
ASCII PCD, Binary PCD, Binary compressed PCD
You can upload at most 1,000 files at a time using the UI. If you have more files, you can upload them in chunks. However, we recommend using the SDK to upload a large number of files.
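If you do upload through the UI in chunks, the batching itself is straightforward. The sketch below is an illustrative helper (not part of Mindkosh) that splits a file list into batches that respect the 1,000-file limit:

```python
from typing import Iterable, List

def chunk_files(paths: List[str], batch_size: int = 1000) -> Iterable[List[str]]:
    """Split a list of file paths into batches no larger than batch_size."""
    for start in range(0, len(paths), batch_size):
        yield paths[start:start + batch_size]

# 2,500 files split into three batches of 1000, 1000, and 500
batches = list(chunk_files([f"image_{i:05d}.png" for i in range(2500)]))
```

You would then drag and drop one batch at a time into the Add files dialog.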
Uploading using SDK
Please refer to the SDK documentation to see how you can upload data using the SDK. Note that you need an SDK token to use the SDK - please get in touch with us to get one.
Adding data from cloud storage
Instead of uploading the data to our servers, you can also connect your own Cloud storage to Mindkosh. In this setup, your data is never stored on Mindkosh servers, and is instead streamed directly from your storage to the user's browser.
To add data from your own cloud storage, you first need to add your cloud credentials to Mindkosh. This is a one-time process. Once you've added your credentials, you can add data any number of times without re-entering them.
We currently support AWS S3, Microsoft Azure and Google Cloud Storage.
To add data from your cloud storage:
Create a new Dataset.
Select User cloud as the storage location.
Select the appropriate Cloud provider and follow the steps outlined below.
AWS S3
MS Azure
Google cloud storage

When connecting your cloud storage to Mindkosh, your cloud storage provider may levy outbound data transfer charges. Refer to your provider's pricing documentation to see if this applies to you. As an example, most AWS S3 setups charge around USD 0.09 per GB of data transferred out of S3.
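To get a rough sense of this cost, multiply the volume you expect to stream by your provider's per-GB egress rate. A minimal sketch, assuming the example S3 rate above (your actual rate varies by provider and region):

```python
# Example egress rate from the note above; check your provider's pricing page.
EGRESS_USD_PER_GB = 0.09

def egress_cost(gb_transferred: float, rate: float = EGRESS_USD_PER_GB) -> float:
    """Estimate outbound data transfer cost in USD."""
    return round(gb_transferred * rate, 2)

# Streaming a 250 GB dataset out of storage once:
cost = egress_cost(250)  # 22.5 USD
```

Note that annotators streaming the same files repeatedly can multiply this figure.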
Adding data from AWS S3
Make sure you've added your credentials before creating the dataset. You can check out the guide to add your credentials for AWS S3 here. To add data from your S3 bucket, enter the following details.
Location
The AWS region of your bucket
Bucket name
Enter your bucket name. Note that you need to enter just the bucket name, without prefixes like https:// or s3://, and without trailing slashes.
Directory
Enter the directory that contains the images/pointclouds you want to add to your dataset. Note that we only scan the directory entered here and skip any sub-directories, so make sure your data sits directly in this directory.
For example, if your images are stored in this manner:
s3://example_bucket_name/sample_dataset/slice_1/image_0001.png
s3://example_bucket_name/sample_dataset/slice_1/image_0002.png
You would enter example_bucket_name as the bucket and sample_dataset/slice_1/ as the directory, and all images in this directory would be added to the dataset.
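The directory-scan rule above (files directly in the entered directory are included, sub-directories are skipped) can be sketched as follows. This is an illustrative model of the behaviour, not the actual Mindkosh implementation:

```python
def keys_in_directory(keys, directory):
    """Return object keys that sit directly in `directory`,
    skipping anything in a sub-directory."""
    prefix = directory if directory.endswith("/") else directory + "/"
    selected = []
    for key in keys:
        if key.startswith(prefix):
            rest = key[len(prefix):]
            if rest and "/" not in rest:  # no deeper path component
                selected.append(key)
    return selected

keys = [
    "sample_dataset/slice_1/image_0001.png",
    "sample_dataset/slice_1/image_0002.png",
    "sample_dataset/slice_1/extra/image_0003.png",  # in a sub-directory: skipped
]
included = keys_in_directory(keys, "sample_dataset/slice_1/")
```

Here only the first two images would be added; the file under extra/ would not.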
Adding data from MS Azure
Make sure you've added your credentials before creating the dataset. You can check out the guide to add your credentials for MS Azure. To add data from a container on Azure, enter the following details.
Location
The region of your container
Storage Account/Container name
Enter your storage account and container name separated by a /. For example, if your storage account name is teststorageaccount and container name is testcontainer, you would enter the following - teststorageaccount/testcontainer
Directory
Enter the directory within the container that contains the images/pointclouds you want to add to your dataset. Only enter the directory inside the container; do not include the container name itself. If your files are at the root of the container - that is, not inside any directory - simply enter / in the directory box.
For example, if your images are stored in this manner:
testcontainer/sample_dataset/slice_1/image_0001.png
testcontainer/sample_dataset/slice_1/image_0002.png
You would enter sample_dataset/slice_1/ as the directory, and all images in this directory would be added to the dataset.
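The combined storageaccount/container value from the field above splits on the first slash. A hypothetical helper (not part of the Mindkosh product) showing the expected format:

```python
def split_account_container(value: str):
    """Split a 'storageaccount/containername' value into its two parts."""
    account, sep, container = value.partition("/")
    if not sep or not account or not container:
        raise ValueError("expected the form 'storageaccount/containername'")
    return account, container

# Using the example values from the section above:
account, container = split_account_container("teststorageaccount/testcontainer")
```

Entering only one of the two parts, or omitting the slash, would not identify a container.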
Adding data from Google Cloud Storage
Make sure you've added your credentials before creating the dataset. You can check out the guide to add your credentials for Google Cloud Storage. To add data from a bucket on Google Cloud, enter the following details.
Location
The region of your bucket
Bucket name
Enter your bucket name. Note that you need to enter just the bucket name, without prefixes like https:// and without trailing slashes.
Directory
Enter the directory within the bucket that contains the images/pointclouds you want to add to your dataset. Only enter the directory inside the bucket; do not include the bucket name itself. If your files are at the root of the bucket - that is, not inside any directory - simply enter / in the directory box.
For example, if your images are stored in this manner:
testbucket/sample_dataset/slice_1/image_0001.png
testbucket/sample_dataset/slice_1/image_0002.png
You would enter sample_dataset/slice_1/ as the directory, and all images in this directory would be added to the dataset.
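Both the S3 and Google Cloud bucket-name fields expect a bare bucket name with no scheme prefix or trailing slash. An illustrative helper (not part of Mindkosh) that normalizes a pasted value into that form:

```python
def normalize_bucket_name(raw: str) -> str:
    """Strip scheme prefixes (s3://, gs://, https://) and trailing
    slashes, leaving just the bare bucket name the form expects."""
    name = raw.strip()
    for prefix in ("s3://", "gs://", "https://", "http://"):
        if name.lower().startswith(prefix):
            name = name[len(prefix):]
    return name.rstrip("/")

normalize_bucket_name("gs://testbucket/")  # -> "testbucket"
```

A value like s3://example_bucket_name/ would likewise reduce to example_bucket_name.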