Put multiple files and directories into a one-file database [closed]

I’ve been building my own kind of database for a while now without knowing any of the basics of databases, so I ended up storing everything in directories.

I recently found out that most databases use a single file (or a small set of files) to operate, not millions of files, because searching through one file takes less overhead.

Each file contains about six lines of information about the item named in the filename: title, category, popularity, and so on, in a fixed structure with no randomness. The files are stored at locations looking like this:

database/*/*/*.txt

The first two stars describe the category of the file, and the bottom level is where the files themselves are saved.
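A layout like this can be walked and parsed with a short script. Here is a minimal sketch in Python (the post's actual converter was written in C#); the `Key: value` line format and the specific field names are assumptions, since the post doesn't show a sample file:

```python
from pathlib import Path

def load_records(root):
    """Walk database/<category>/<subcategory>/*.txt and return one
    record (dict) per file.

    Assumes each file holds simple "Key: value" lines; the exact keys
    are illustrative, not taken from the original post."""
    records = []
    for path in Path(root).glob("*/*/*.txt"):
        # The two directory levels above the file give its category.
        category, subcategory = path.parts[-3], path.parts[-2]
        fields = {}
        for line in path.read_text(encoding="utf-8").splitlines():
            if ":" in line:
                key, _, value = line.partition(":")
                fields[key.strip().lower()] = value.strip()
        fields["category"] = category
        fields["subcategory"] = subcategory
        fields["id"] = path.stem  # filename without extension
        records.append(fields)
    return records
```

With the records in plain dicts like this, loading them into almost any single-file or server database becomes a straightforward insert loop.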

What database application would be easy to convert to, and which would fit this use case best?

The server runs Debian on an i3 with 4 GB of RAM, and I have a Windows PC available as well.

Answer

This question was asking for too much; it’s not just click and go. I thought more people had problems like this, but then I realized that everyone is using premade search engines.

I ended up downloading the whole database to my Windows computer and writing a program in C# that automatically goes through all the files, reads their content, and POSTs it to an Elasticsearch database that I installed on the Debian server. I should probably have written a file-to-file converter whose output I could load straight into the database, but I ended up doing file-to-POST-request instead.

The drawback of doing it this way is that it isn’t fast: it took 2 hours to transfer 700,000 files into the database.
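One likely reason for the slow transfer is issuing one POST per file. Elasticsearch’s `_bulk` endpoint accepts many documents in a single request, which usually speeds up ingestion dramatically. Here is a hedged sketch in Python (not the author’s C# program) of building such a payload; the index name `files`, the field names, and the server URL are assumptions:

```python
import json

def build_bulk_payload(records, index="files"):
    """Build an Elasticsearch _bulk request body (NDJSON) from parsed
    records. Each record is a dict with an "id" key; the index name
    "files" is an assumption, not from the post."""
    lines = []
    for rec in records:
        # Action line, then the document itself (without the id field).
        lines.append(json.dumps({"index": {"_index": index, "_id": rec["id"]}}))
        lines.append(json.dumps({k: v for k, v in rec.items() if k != "id"}))
    # The bulk body must end with a trailing newline.
    return "\n".join(lines) + "\n"

# Sending it (sketch; assumes Elasticsearch reachable on port 9200):
# import urllib.request
# req = urllib.request.Request(
#     "http://debian-server:9200/_bulk",
#     data=build_bulk_payload(records).encode("utf-8"),
#     headers={"Content-Type": "application/x-ndjson"},
# )
# urllib.request.urlopen(req)
```

Batching a few thousand documents per request, instead of one POST per file, is typically what turns a multi-hour import into a few minutes.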

The program will not be released publicly because of specific strings I used in the files. So this was way harder than I expected.

Attribution
Source : Link , Question Author : Typewar , Answer Author : Typewar
