Development Updates info.btree.at

Static sites are always the easiest, also in my head, and I will not use any SSR in the near future. Maybe I'll look into a PWA or generating an Electron app, but those are just thoughts. For building, GitHub Actions are very powerful; for example, my personal homepage is automatically built and uploaded when I commit to the master branch. It also invalidates the Amazon AWS cache, which I use for asset caching (probably not needed for my 2 visitors a month, but I like to over-engineer things ;) ). You can look at my workflow here: btree_info/.github/workflows/deploy.yml at master · HannesOberreiter/btree_info · GitHub

If you have multiple people working on a project, a good Git setup and contribution documentation are a must-have, in addition to test cases. Otherwise you need to check every piece of changed code extensively.

Cheers
Hannes

Hi all,

frontend decisions, next part. After some mapping fun last time, I struggled a little bit with my webpack build setup. After some research I came to a “modern” tool for frontend dev: https://vitejs.dev/. It is actually not that new (2019) and I don’t know why I skipped it, although it is the recommended build tool for Vue 3 (you can also use it with React). The transition from webpack was fast, as the settings for Vite are pretty simple and the tool itself is really blazing fast (see video, last part where I update a nested component). The big advantage of Vite is that it only rebuilds changed components (HMR), so there is no need to rebuild the full app on each edit. At least that is how much I understand it.
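
For anyone curious how small the configuration is, a minimal vite.config.ts for a Vue 3 project looks roughly like this (a sketch, not my exact config; the “@” alias is just a common convention):

import { defineConfig } from 'vite';
import vue from '@vitejs/plugin-vue';
import { fileURLToPath, URL } from 'node:url';

export default defineConfig({
  plugins: [vue()],
  resolve: {
    // optional "@" import alias pointing at src/ (convention, not required)
    alias: { '@': fileURLToPath(new URL('./src', import.meta.url)) },
  },
});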

After getting the build setup running again, the next task was again “chores”, e.g. user settings, credentials editing and company settings. On this note, I have a really “bad” user management system, as when I started the database and app I never thought about multiple “users” and “companies”. If I were to create a new database from “scratch”, I would use and absolutely recommend a third-party user authentication system (Auth0, OpenID etc.). Not only to prevent my own headaches but also to improve security.

Cheers
Hannes


Just FYI, I can see the shared videos as web-embedded video with Chrome only and not with FF. So in case you get the error

The media could not be loaded, either because the server or network failed or because the format is not supported.

try another browser. Btw thx for sharing Hannes!

Thanks for the info. Yeah, you are correct, although it should be supported: WebM video format | Can I use... Support tables for HTML5, CSS3, etc

The same is true for Safari, where it also won’t work inside the browser.

I may make a playlist on YouTube where it gets auto-converted.

Edit: YouTube Playlist App Dev - YouTube

Cheers
Hannes

Hi all,

frontend and a little bit of backend decisions, next part. While happily building my frontend I ran into some brain twists with my “REST” API design, but I never followed REST design strictly anyway.

Some cases:

  • Deleting or getting thousands of rows at the same time by ID: with REST I’m somewhat limited by the maximum URL length, as by definition such requests won’t take body params (see the sketch after this list).
  • Special endpoints for things like moving task dates in a calendar, where I felt that doing the work in the backend is more future-proof than offloading everything onto the frontend.
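
To illustrate the first point, here is a hedged sketch of how a batch delete can take the IDs in a POST body instead of the URL; the route name, handler and helper are made up for illustration:

import express from 'express';

const router = express.Router();

// Hypothetical batch delete: IDs travel in the request body (requires the
// express.json() body parser), so the URL length limit is no concern.
// Not strictly RESTful, but practical for thousands of rows.
router.post('/tasks/batchDelete', async (req, res) => {
  const ids: number[] = req.body.ids ?? [];
  if (!Array.isArray(ids) || ids.length === 0) {
    return res.status(400).json({ error: 'ids must be a non-empty array' });
  }
  const deleted = await deleteTasksByIds(ids); // placeholder for the real DB call
  return res.json({ deleted });
});

// Placeholder so the sketch is self-contained
async function deleteTasksByIds(ids: number[]): Promise<number> {
  return ids.length;
}

export default router;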

I already said at the beginning, in my backend decisions, that I want to keep as much business logic as possible in the backend. That does not seem to play nicely with a strict REST design.

Nevertheless, after some fine-tuning I finally achieved my first datatable, which is kind of a big step for me as I want to show most of my app data as tables.

Some points I wanted to have at minimum: server-side loading of big data, table state saved in local storage, custom column display, basic search and ordering, and making it composable for my other tables. There are still some bugs and mini optimisations needed, but I’m already quite happy with the result and looking forward to building my other 20… tables :).
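
A rough sketch of how the “save state in local storage” part can be wrapped into a Vue composable; the name useTableState and the fields are made up for illustration:

import { ref, watch } from 'vue';

interface TableState {
  columns: string[]; // visible columns
  orderBy: string;
  direction: 'asc' | 'desc';
  limit: number;
}

// Hypothetical composable: restores the table settings from localStorage
// and persists every change back to it.
export function useTableState(tableId: string, defaults: TableState) {
  const key = `table-state-${tableId}`;
  const saved = localStorage.getItem(key);
  const state = ref<TableState>(saved ? JSON.parse(saved) : defaults);

  watch(state, (value) => localStorage.setItem(key, JSON.stringify(value)), {
    deep: true,
  });

  return { state };
}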

https://youtu.be/eWj1f8fxhME

Cheers
Hannes


Hi all,

are there any MySQL database magicians in the Hiveeyes community? I'm currently running into performance issues with my VIEWs and would need some help; I don’t want to go back to raw queries in my code (which is what I did in my current app to improve performance).

Of course I am willing to pay pocket money for help.

I can give you a database dump with example data, and here is the raw SQL code (*_view_.*.sql files) (especially 20220125124845_view_queens_locations.sql, 20220125123313_view_hives_locations.sql, 20220125152944_view_tasks_apiaries.sql)

Cheers
Hannes

[edit] For further discussion and the solution see the separate topic Problems with MySQL/MariaDB performance


Hi all,

after my short excursion into database engineering for dummies I’m back on track building stuff. In my database schema I may or may not have made some bad decisions for the future. For my own sanity I introduced multiple redundant columns, which improves my code a lot and reduces the use of VIEWs in queries.

My original design of “tasks” (eg. feeding, treatment …) was like this:

task - 1:1 - hive - 1:n - movedate - n:1 - apiary - 1:1 - company

This means that to figure out if a “task” belongs to the current user, I had to go the “whole” route down. Now I have introduced redundant user_id columns to better filter some tables without relying on joins and VIEWs.

task - n:1 - company
hive - n:1 - company
task - n:1 - hive - 1:n - movedate - n:1 - apiary - n:1 - company

I also don’t have to care about cascades “that” much anymore, as a “task” could now exist without a hive or apiary. It actually should not happen, but it eases my mind.
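
In code the difference is roughly the following (a knex sketch with table and column names guessed from the relations above, not my actual queries):

import knex from 'knex';

// placeholder connection, just so the sketch is self-contained
const db = knex({ client: 'mysql2', connection: { database: 'btree' } });
const currentUserId = 1;

// Before: walk the whole relation chain (or a VIEW) to check ownership
const tasksBefore = await db('tasks')
  .join('hives', 'hives.id', 'tasks.hive_id')
  .join('movedates', 'movedates.hive_id', 'hives.id')
  .join('apiaries', 'apiaries.id', 'movedates.apiary_id')
  .where('apiaries.user_id', currentUserId)
  .select('tasks.*');

// After: the redundant user_id column makes the filter a single WHERE
const tasksAfter = await db('tasks').where('user_id', currentUserId).select('*');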

As for the frontend, I was able to finish a lot of “bigger” chunks. This also came with the realisation that I really need to start writing tests soon, something I have completely ignored in my whole self-taught programming career. So far I’m leaning towards starting with e2e tests, and to reduce the need for “new” languages I’m thinking about using https://www.cypress.io/ for frontend and backend API testing. I don’t know the downsides of using Cypress for the backend, but for me it is probably better to learn one tool well and not different flavours again.

Short video of my finished integration of Dropbox, which took me around 4 days of pain, but I somehow managed. The official SDK has a lot of missing TypeScript types and the documentation was lacking too. The integration is also available in my current app, but this time it is a lot cleaner and more secure with access and refresh tokens. Currently I also support uploading to my own server, which I will not support in the new app.
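
The token part that caused most of the pain boils down to something like this with the official SDK (a sketch under the assumption that the app credentials and the stored refresh token are available; error handling omitted):

import { Dropbox } from 'dropbox';

async function uploadBackup(storedRefreshToken: string) {
  // When constructed with the refresh token plus app credentials, the SDK
  // can fetch fresh short-lived access tokens on its own (my assumption of
  // the intended usage, check the SDK docs for your version).
  const dbx = new Dropbox({
    clientId: process.env.DROPBOX_CLIENT_ID,
    clientSecret: process.env.DROPBOX_CLIENT_SECRET,
    refreshToken: storedRefreshToken,
  });

  await dbx.filesUpload({
    path: '/backup.json',
    contents: JSON.stringify({ hello: 'world' }),
  });
}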

Cheers
Hannes

Hi all,

beekeeping season is slowly kicking in, but I still manage to get some time in for my app development.

A little side-story on why I have come to love TypeScript. A few days ago I got a bug report on my old code from a user: my queen rearing logic wasn’t working for some of his rearing methods. After a lot of trial and error the solution was rather simple; it was a type error which only mattered if there were more than 10 rearing steps:

if (value.position > sel_position) // "10" > "5" = false
if (parseInt(value.position) > parseInt(sel_position)) // 10 > 5 = true
// https://javascript.info/comparison#string-comparison

Now to some stats, as I love them. I would say I’m around 75% finished with the reimplementation of my old code, and going forward there will be a lot of structural changes compared to the old app.

  • Old Frontend:
    • 234 files, 46,318 lines of code
  • New Frontend:
    • 188 files, 18,769 lines of code
  • Old Backend:
    • 61 files, 11,048 lines of code
  • New Backend:
    • 138 files, 9,052 lines of code

For the frontend I could trim down a lot of code, as in my old version I had a lot of redundant code pieces. In addition, Vue.js helps me a lot to reduce code chunks, especially with its easy component system.

The new backend has more than double the number of files of the old one, which is due to my “new” approach of moving away from behemoth single files and using smaller files for different logic. One example: in the old app I had one router file (controller.php); now each main route has its own file (apiary.route.ts, hive.route.ts, …).

On the frontend part I also try to implement “best” practices from Vue.js. One of these is a flat file structure. Initially I liked it a lot, but currently, with the growing number of files, I’m a little bit unsure, as the components directory keeps growing:

Although the problem lies with me, as I still open files manually by mouse-clicking. I really need to get used to opening files with keyboard search; then the flat file structure really makes sense and works great:

Lastly a video of my queen rearing logic, which was a big part, and I’m quite happy to have reimplemented it a lot better than in my old app:

Cheers
Hannes


Hi all,

last week was heavy for me. I created my first technical alpha to test deployment and build automation. Again lots of new stuff to learn. Now I know why Google Firebase and similar tools are loved by developers, as building a full-stack app from scratch is really time-demanding and tough.

First I had to upgrade my testing server, as it was really slow when compiling and serving stuff. The current setup looks like this and it's pretty fast (without user load, that is ;)):

The +10GB volume is reserved for my database, as the local disk is not “stable” and could be lost if the server goes down. The server does a backup each day and is set up with nginx as HTTP server and reverse proxy, certbot for automatic setup of SSL certificates, and Docker for my API server image and database image.

In addition I set up a cronjob which auto-updates/upgrades server packages and restarts daily, in case of any memory leaks. I don’t know if that is good practice, but it is still better than missing any crucial security patches.

Many trials and errors later I managed to set up nginx and learned how to use a reverse proxy with Docker containers. It seems to be a good idea to pin a fixed subnet for your Docker network, as mine changed twice, my reverse proxy stopped working and I did not know why.

Now my docker-compose.yml looks like this:

...
networks:
  btree-db-network:
    driver: bridge
    ipam:
      config:
        - subnet: 172.18.0.0/16

and my upstream.conf like this:

# path: /etc/nginx/conf.d/upstream.conf
upstream btree_at_api {
    server 172.18.0.1:1338; # Gateway + Port
}

After setting up the API and starting my Docker containers I had to plan for database backups. After a little bit of searching I found databack/mysql-backup, a Docker container whose job is exactly this; it was rather easy to set up. The only question was where to save the backups.

Although Amazon AWS would be a good choice (the only downside being that you support Amazon's domination of the market) and it would be supported by the Docker container, I thought I'd go the challenging way, as I have a secure Nextcloud server running, so the “easiest” solution was to set up a connection to it. Nextcloud uses WebDAV, so I had to install davfs2 as driver on my server to be able to mount the Nextcloud disk. Thanks to a rather straightforward guide which I could follow, I somehow managed to do it: Guide Mount Nextcloud WebDAV on Linux

After the backend was done, I moved on to the frontend, which is “only” static as I do not use SSR, so deployment is easy: a GitHub Action auto-builds and pushes it with SFTP onto the server's www folder. It was my first SPA and I ran into problems when one refreshes the browser. I solved it with a try_files $uri $uri/ /index.html; directive inside my nginx config:

server {
    ....
   # This part is to remove the service worker from cache, also very important when building a PWA app
   location = /service-worker.js {
        expires off;
        add_header Cache-Control no-cache;
        access_log off;
    }
   .....
    # SPA reload bug workaround https://megamorf.gitlab.io/2020/07/18/fix-404-error-when-opening-spa-urls/
   location / {
      try_files $uri $uri/ /index.html;
    }

}

Now everything runs more or less smoothly. The last part was to auto-build the Docker container from GitHub for my API. The Docker Auto-Build feature was disabled in 2021 (Changes to Docker Hub Autobuilds - Docker), but thankfully there are already GitHub Actions which build and push the container to Docker Hub.

Lastly, as previously mentioned, I started building e2e tests for my backend. I gave up on Cypress, as there was no clear how-to for API testing. Now I’m going with the mocha and chai combo plus supertest for the HTTP access. It again took me quite a while to get it running (the main problem was that my node server was not closing after the mocha run; after hours with nodewtf I found out that nodemailer was the problem and that I had introduced a memory leak, which I then fixed, so already a good argument for testing :) )
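
For anyone curious, a test in this setup has roughly the following shape (a sketch; route paths and response fields are placeholders, and global.app / global.demoUser are the globals I set up in my mocha hooks):

import { expect } from 'chai';
import request from 'supertest';

describe('GET /api/v1/user', () => {
  it('rejects unauthenticated requests', async () => {
    const server = (global as any).app.server; // set up in the before() hook
    const res = await request(server).get('/api/v1/user');
    expect(res.status).to.equal(401);
  });

  it('returns the profile when logged in', async () => {
    const server = (global as any).app.server;
    const agent = request.agent(server); // keeps the session cookie between calls
    await agent.post('/api/v1/auth/login').send({
      email: (global as any).demoUser.email,
      password: (global as any).demoUser.password,
    });
    const res = await agent.get('/api/v1/user');
    expect(res.status).to.equal(200);
    expect(res.body).to.have.property('email');
  });
});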

mocha test

Cheers
Hannes

Hi all,

this week was writing tests for my backend and the CI test implementation.

First of all, I’m really happy that I forced myself to write tests; I already found a few bugs which I surely would have missed before going live. The testing also runs rather fast, which means I can keep it running during development.

Currently I have written close to 500 test cases, some of them nested, which comes down to ~8s on my local machine.

testing-local

On GitHub for my CI implementation it boils down to ~1-2 minutes.

Pollution of test database

I did not know how to handle the continuous pollution of my test database, as the tests are automatically run on change. First I tried to drop the database and create a new one, but that left me with some user permission problems. Next I tried to roll back all migrations and then migrate back up, which took quite a long time, as I have already written a lot of migration files.

My final solution was to a) migrate to the latest database version and b) truncate all tables (not needed on CI). b) was solved with some raw SQL queries: first I fetch the table names from the schema information, then disable the foreign key checks and loop over all tables. Here is the full code which is run before the tests:

before(async function () {
    this.timeout(10000); // Standard time-out for mocha is 2s
    console.log('  knex migrate latest ...');
    await knexInstance.migrate.latest();
    if (process.env.ENVIRONMENT !== 'ci') {
      console.log('  knex truncate tables ...');
      //knexInstance.migrate.rollback({ all: true })
      await knexInstance.raw('SET FOREIGN_KEY_CHECKS = 0;');
      const tables = await knexInstance
        .table('information_schema.tables')
        .select('table_name', 'table_schema', 'table_type')
        .where('table_type', 'BASE TABLE')
        .where('table_schema', knexConfig.connection.database);
      for (const t of tables) {
        if (
          !(
            ['KnexMigrations', 'KnexMigrations_lock'].includes(t.TABLE_NAME) ||
            t.TABLE_NAME.includes('innodb')
          )
        )
          await knexInstance.raw(`TRUNCATE ${t.TABLE_NAME};`);
      }
      await knexInstance.raw('SET FOREIGN_KEY_CHECKS = 1;');
    }
    global.app = require(process.cwd() + '/dist/api/app.bootstrap');
    global.server = global.app.server;
})

Mocha peculiarities

Here are some cases which took me quite a while to figure out with mocha.

done()

Normally you have to close your tests and before/after hooks with done(), e.g.:

  before((done) => {
    ....
    done();
  });

If you use promises or async functions, you should not call done():

  before(async () => {
    await someAsyncSetup(); // placeholder for any promise-returning setup
  });

global

If you want to use variables across multiple test files, there is the global object, e.g. for your test user login.

global.demoUser = {
  email: `test@btree.at`,
  password: 'test_btree',
  name: 'Test Beekeeper',
  lang: 'en',
  newsletter: false,
  source: '0',
};

Closing server

As mentioned last time, I had to use nodewtf to find out why mocha was not auto-closing. This time it was knex which did not close my connection, so you have to call knex.destroy().

  after((done) => {
    global.app.boot.stop();
    global.app.dbServer.stop();
    knexInstance.destroy();
    done();
  });

CI / GitHub Action

The goal was to automatically run the tests when a pull request on the main branch happens. This was again a little bit tricky, as you cannot really test it on your local machine.

I first played around with how to create a database; the first idea was to use a service container (which is a Docker container). But after a while I figured out that MySQL is actually already installed on the Linux runners, just not active, and you only need to start it.

# Start MySQL
sudo systemctl start mysql
# Create our testing database
mysql -e 'CREATE DATABASE ${{ env.DB_DATABASE }};' -u${{ env.DB_USER }} -p${{ env.DB_PASSWORD }}

Next up were again permission problems, as newer MySQL versions do not allow simple password access, which I could solve after some googling with a few SQL commands.

# Change identifier method for our testing user
mysql -e "ALTER USER '${{ env.DB_USER }}'@'localhost' IDENTIFIED WITH mysql_native_password BY '${{ env.DB_PASSWORD }}';" -u${{ env.DB_USER }} -p${{ env.DB_PASSWORD }}
# Let MySQL know that privileges changed 
mysql -e "flush privileges;" -u${{ env.DB_USER }} -p${{ env.DB_PASSWORD }}

The final working action can be found here: btree_server/test.yml at main · HannesOberreiter/btree_server · GitHub

Cheers
Hannes


Hi all,

beekeeping season is going strong and the last queen rearing series is done for this year. I'm quite happy with the outcome: this year I got 42 new queens ready, more than enough for myself.

Development has slowed down a little bit due to lack of time, but I'm still writing tests. I also started writing tests for the frontend part, this time with Cypress. Overall my two main impressions of Cypress e2e are:

  • positive: you can see the results
  • negative: quite slow to test the whole app

It is probably something you only do before publishing a new version, as you cannot really let all tests run during development. But you can also run only a subset of tests, for example for a new page or form you are writing; this works in parallel without any problems.
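
Running only a subset is mostly a matter of narrowing the spec pattern (or using cypress run --spec on the command line); a sketch assuming the Cypress 10+ config format, with made-up paths:

import { defineConfig } from 'cypress';

export default defineConfig({
  e2e: {
    baseUrl: 'http://localhost:8080', // assumed local dev server
    // Point this at a single folder/file while working on a new page or form,
    // or pass --spec on the CLI; the full pattern is only run before a release.
    specPattern: 'cypress/e2e/**/*.cy.ts',
  },
});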

Anyway, I feel a little bit drained from writing any more tests and will probably continue writing features and maybe finish up the backend (iCal, public API for hive scales, statistics).

Here is a video of the integration test. As you see, at the end it only tests if the page is present and loaded; this part of course needs more in-depth tests in the future.

Cheers
Hannes


Hi all,

after my unwillingness to write more tests (to prevent burn-out), I added the public API for hive scales and iCal to the server side.

As for hive scales, basic validation was a lot easier to manage thanks to express-validator, and brute-force protection, rate limiting and the database insert were set up rather quickly in comparison to my old self-made PHP behemoth.

Example of my route validation for hive scale API calls:

Validator.validate([
  param('ident').isString(),
  param('api').isString(),
  query('action').exists().isString().isIn(['CREATE', 'CREATE_DEMO']),
  query('datetime').optional().isISO8601().toDate(),
  query('weight').optional().isNumeric().toFloat(),
  query('temp1').optional().isNumeric().toFloat(),
  query('temp2').optional().isNumeric().toFloat(),
  query('hum').optional().isNumeric().toFloat(),
  query('rain').optional().isNumeric().toFloat(),
  query('note').optional().isString().isLength({ max: 300 }),
]),
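
A matching call could then look roughly like this; the host name and the /external/scale/:ident/:api path are my guesses based on the params above, so treat them as placeholders:

async function pushScaleReading() {
  // Hypothetical URL; ident and API key identify the scale and user
  const url = new URL('https://api.btree.at/external/scale/MY_SCALE/MY_API_KEY');
  url.searchParams.set('action', 'CREATE');
  url.searchParams.set('datetime', new Date().toISOString());
  url.searchParams.set('weight', '42.5');
  url.searchParams.set('temp1', '21.3');

  const response = await fetch(url.toString()); // Node 18+ or browser fetch
  console.log(response.status);
}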

Generating iCal endpoints was exceptionally easier than the way I used to do it, thanks to ical-generator. There is a real wealth of node packages and I’m really happy with my decision to write the server side in node.
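
For comparison, generating a calendar with ical-generator is only a few lines (a sketch; the event fields are placeholders for the real task data):

import ical from 'ical-generator';

// Build a calendar that can be served with content type text/calendar
const calendar = ical({ name: 'btree tasks' });

calendar.createEvent({
  start: new Date('2022-09-01T10:00:00Z'),
  end: new Date('2022-09-01T11:00:00Z'),
  summary: 'Feeding: Apiary Home',
  description: 'Sugar syrup 1:1',
});

const ics = calendar.toString();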

Example of the user API page on the frontend; I’m proud of how clean I managed to get it:

On the frontend part I started with creating statistics, which I really enjoy. First I wanted to create everything with d3, but I'm now using ECharts as it is better integrated with Vue.

Here a small video of the current implementation:

Lastly, I’m currently in alpha stage and some users are already testing the app. If anyone feels like giving feedback and doing a little bit of “bug” hunting, send me a message (this is not meant as a public beta, so if you only want to check out the app please be patient, as I will do a public beta in the near future).

Cheers
Hannes


Hi all,

my development speed is lagging a little bit, due to more work in the forest. In addition we had a windstorm a few days ago, which means even more work…

Nevertheless, after some crucial bug fixes I have now opened the new version for open beta testing. If you are already a user of my application you can log in with your credentials. The database is a copy of today (2022-08-22) and it is not connected to the live app.

Login: https://beta.btree.at/

In addition I created a test user, as you cannot register a new account on the beta page (my email logic is not finished).

Test-User: messe@btree.at
PW: messe_btree

Theoretically you could delete the demo account; I did not care to protect it. If the demo account is broken, please don’t hesitate to tell me, then I will reset the database.

Cheers
Hannes

Hi all,

the past few days I did a redesign of my info page, which will again hold documentation for my application and some personal stuff.

My previous version was made with Nuxt (a framework for Vue), but the overhead always felt a little bit high and I did not like the SPA feeling for a simple static page. Searching for alternatives, I stumbled upon Hugo, VitePress and Astro.

  • Hugo: After some messing around, it was a little bit too much effort and too much new stuff to learn for me.
  • VitePress: Would have been the quickest transition, but I did not want an SPA again.

Finally I looked into Astro, the “new” kid on the block, which recently reached a stable version 1.x.x. They promise speed by using an island architecture, and you can use more or less any JavaScript framework you like. It also supports content in Markdown format without any additional plugins.

After some initial problems figuring out how to use components inside Markdown files (hint: you need to use .mdx files), it was a very smooth transition from my Nuxt page to Astro.

The Markdown files were quickly set up with the correct frontmatter, an overall documentation template style was quickly adapted to my needs, and the integration of my Vue components worked mostly without too much hassle.
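
The wiring for that boils down to registering the integrations in the Astro config, roughly like this (a sketch; the integration list is an assumption):

// astro.config.mjs
import { defineConfig } from 'astro/config';
import vue from '@astrojs/vue';
import mdx from '@astrojs/mdx';

export default defineConfig({
  site: 'https://info.btree.at',
  integrations: [vue(), mdx()],
});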

After a few days of coding, everything was running with auto-deployment to my server via GitHub Actions. The page runs perfectly fine and with incredible speed. I still have hydration errors because of my dark mode and language Vue components (which I have no idea how to fix), but I don’t really mind the flickering so far.

Conclusion: Page speed is impressive, easy to adapt framework, nice documentation.

GitHub Repo: GitHub - HannesOberreiter/btree_info: Landing Page for Beekeeping App and Personal Homepage, made with Astro
Homepage: https://info.btree.at/

Cheers
Hannes

Many thanks for sharing this journey with us! And respect that you change horses while running; often people stay on the tools they are used to simply because they are used to them :-) which is sometimes not a good reason.

Hi clemens,

it is probably a selfish reason that I write here, as it gives me time to think about what I have achieved so far, and when writing it I often get more or better ideas. Explaining the problem is the first step to problem solving :nerd_face:.

In web dev you probably cannot adapt fast enough to new things anyway, but with Astro I feel really good, as it connects all the frameworks and you can easily switch out parts. I wrote something similar at the beginning, when deciding on my frontend (9th March, 2022):

Selecting a framework for the frontend could be easy, but if you research the various JavaScript frameworks you kind of feel overwhelmed, as you don’t want to invest time into a technology which may be obsolete again in a year. The biggest out there are probably React and Angular and I played around with both, but they felt too big for me when starting out. After the big ones I looked at the “fresher” starts Svelte and Vue.js; both felt very good, with a lower entry barrier in my opinion.

The second reason why I like to write about my journey here is that it gives me a history of what has changed. Fun fact: by pure chance I recently found some images of my very first version, which I presented in 2015. It was made with jQuery, PHP and self-written CSS. I even wrote a secondary version with jQuery Mobile as framework. Crazy first steps into programming, when I think about it.

Sometimes I question my own sanity for keeping on programming my app in my free time. The reason is the handful of users who started using the app at the very beginning and are still using it. Making something people use fills me with joy; I don’t know why, but it is as it is.

Cheers
Hannes


Hi all,

I'm working on my final touches before going live with the newest version, which should be this month, as I really want the new version running before the 2nd EuroBee in Germany, where I will be present.

CORS & Open API

Hate it or love it, CORS is a pain. For security reasons I only whitelist certain addresses to access my backend API. This caused some headaches with my open API for iCal and hive scales. One would think it is easy to lift the CORS logic for only one route, but it was harder than I thought. Nevertheless, after some trial and error I finally managed to get it running in a somewhat clean way.

I use the node cors package and inject it into my main app at the beginning as global middleware; as I have many routes and use containers, I did not want to explicitly set CORS for each individual route, so I needed a way around that. I somehow missed that you can set the CORS settings asynchronously, but after I found out that you are able to do so, I came up with this solution:

// Inject cors middleware globally and add settings as function
app.use(Cors(this.corsOptionsDelegate));
  /**
   * @description Configuring CORS asynchronously, will disable CORS for /external/ route
   */
  private corsOptionsDelegate = function (req, callback) {
    const origin = req.header('Origin'); // before this I have a custom middleware which always sets an origin, also if not given
    // Main settings
    let corsOptions = {
      origin: true,
      methods: ['GET', 'POST', 'PUT', 'PATCH', 'DELETE'],
      allowedHeaders: [
        'Accept',
        'Content-Type',
        'Authorization',
        'Origin',
        'From',
      ],
    } as any;
    let error = null;

    if (req.url.indexOf('external') !== -1) {
      // Allow API calls to scale and iCal without CORS
      corsOptions = { origin: false };
    } else if (
      authorized.indexOf(origin) === -1 &&
      origin &&
      env !== ENVIRONMENT.development
    ) {
      error = notAcceptable(`Domain not allowed by CORS: ${origin}`);
    }
    callback(error, corsOptions);
  };

Now everything and everyone should be able to push data to or receive data from my API. I also updated the documentation to reflect the changes for the new beta version. As for hive scale data, I simply use a GET request with the transferred data as params. I think this is the easiest solution and can be quickly adapted by users, as they only need to change the URL and add variable params in their code.

In addition I added a simple weight graph for the hive scale data, nothing too fancy and not comparable to the Hiveeyes Grafana dashboards, but good enough for me.

QR Codes

I had the ability to scan and write QR codes in an older version of my app but discontinued it. Recently I checked the usability again and it is a lot better now. Therefore I created the option to scan and create QR codes for your colonies. The scanner works on all devices I tested (Mac laptop, iPad, Android) and is very fast. Although I simply save the URL to the detail view of a colony, it has the added function of automatically selecting the scanned hive if there is a hive dropdown in the current window. This should allow very fast working in the field if you have a couple hundred hives and don’t want to scroll through the list. It now feels even faster than NFC, which was previously my go-to option. But the big difference is still that the NFC scanner is always active in the background (if you activate it) and I can simply scan the tag, whereas for the QR tag I need to open the camera dialog. There are pros and cons, as with everything in life.

Cheers
Hannes


Hi all,

last week I attended the aquaculture conference in Rimini, Italy, and saw a lot of parallels to beekeeping, but they put more effort into topics like sustainability, environmental friendliness and fish well-being, which I really liked. As a vegetarian I think we should not farm fish anyway, but yeah, aquaculture also includes things like algae farming. Of course the most interesting parts for me were the applied computer science and robotics talks.

Nevertheless, yesterday was a long day for me: the live migration to my new application version happened. As this is my third “big” migration, I already knew how to prepare myself.

These were my steps, which helped me a lot:

  • notify users inside the old app about the time of migration (~1 month before)
  • create an ordered list of the steps needed for migration, e.g. set the old app into maintenance mode, reload the live server with all new packages, create DNS entries and redirects …
  • health tips: yoga in the morning and prepare lunch the day before migration
  • backup, backup, backup, and be ready to roll back in case of failure

Overall everything worked as expected; the database needed again a “repair”/“optimisation” after migration, but I was already prepared for it:

mysqlcheck -u -p -h -P --auto-repair --optimize --all-databases --verbose

I had some Docker container start failures due to a missing env entry, which cost me some nerves to find, but it was okay after a few minutes of sweating.

After the service went online again and the DNS was already changed, a user reported he could not log into the app. This was quite frustrating as I could not figure out why, but in the end the user himself gave the hint.

In the new version I clean the emails with normalizeEmail from express-validator. I did not know that Gmail addresses with dots, e.g. max.muster@gmail.com, will be normalised to maxmuster@gmail.com. Of course, after normalisation the new email is not the same as the registered email, because in my old app I did not normalise it. Therefore the user could neither log into the app nor reset his password. A quick decision was made to not normalise emails at all (the same as in the old app); this may cause pain in the future if a user mixes up upper case and lower case.
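
For anyone hitting the same gotcha: validator.js can keep the Gmail dots, so an alternative to dropping normalisation completely would have been something like this (a sketch, not what I shipped):

import { body } from 'express-validator';

// Keep Gmail dots so "max.muster@gmail.com" stays distinct from
// "maxmuster@gmail.com", matching how the old app stored emails.
const emailValidation = body('email')
  .isEmail()
  .normalizeEmail({ gmail_remove_dots: false });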

The next pain were the App Store and Play Store apps. I would love to drop them completely, but lots of people want them and simply don’t know how to install a PWA. Android was quickly done and everything works more or less. iOS was a lot more pain, also because I really have zero experience with Xcode and the Swift programming language. In addition iOS does not really like these kinds of wrapper apps and does not support PWAs. One main thing that broke the iOS app was printing; I solved it by disabling it. Not the best approach, but yeah. Maybe someone here is more into Swift programming? Currently the code looks like this:

 // redirect new tabs to main webview
 func webView(_ webView: WKWebView, createWebViewWith configuration: WKWebViewConfiguration, for navigationAction: WKNavigationAction, windowFeatures: WKWindowFeatures) -> WKWebView? {

        if (navigationAction.targetFrame == nil) {
            let app = UIApplication.shared;
            if(navigationAction.request.url?.absoluteString != ""){
                let url = navigationAction.request.url!;
                if !app.canOpenURL(url){
                    webView.load(navigationAction.request)
                }
            } else {
                // Notify user that printing is not allowed (blank new window, without url)
                self.notif()
            }
        }
        return nil
    }

Another problem was caused by the Stripe payment, which I have now solved by redirecting to the external browser. It's not the best, but it was my only viable solution:

if(requestUrl.absoluteString.contains("stripe")){
  UIApplication.shared.open(requestUrl)
  decisionHandler(.cancel)
} else {
  decisionHandler(.allow)
}

Today, after the big migration, was more relaxing: finishing new demo images for the Play and App Store and cleaning up some junk code.

Cheers
Hannes


Hi all,

the 2nd EuroBee fair is over. I had a fun time, but it reminded me that I’m no salesperson and don’t like places with a lot of people :smiley:.

A few people showed up for my master's thesis presentation. I started with only 3, but the number of listeners gradually increased, which made me quite happy.

Cheers
Hannes


Hi all,

just wanted to give some project updates and insights into my decisions, problems etc.

Session storage changed to Redis

Previously I used my MariaDB as session database, which results in a lot of reads and writes. Not that it really matters for my few active users, but it felt like a perfect place to over-engineer. Therefore I changed to a Redis database for session storage only; it was my first time working with Redis, and thanks to libraries like github.com/luin/ioredis it was actually fairly easy.
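
The wiring is only a few lines with express-session; a sketch assuming connect-redis v7 together with ioredis (option values are placeholders):

import express from 'express';
import session from 'express-session';
import RedisStore from 'connect-redis';
import Redis from 'ioredis';

const app = express();
const redisClient = new Redis(); // defaults to 127.0.0.1:6379

app.use(
  session({
    store: new RedisStore({ client: redisClient, prefix: 'sess:' }),
    secret: process.env.SESSION_SECRET ?? 'change-me',
    resave: false,
    saveUninitialized: false,
    cookie: { maxAge: 1000 * 60 * 60 * 24 }, // one day
  }),
);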

As I did not want to drop any sessions when changing storage, I had a script to write the MariaDB sessions to Redis. This was really overkill, as probably no one would mind being logged out of the app once, but it was an experience by itself and gave me a better understanding of it. As GitHub (never) forgets, here is the old commit where I deleted the code to transfer the sessions: commit/f101de207eacf3b6dbf6a3926aefa59b5e0f1029

Removed Browser Outdated Information

I thought it was a nice feature to remind users that their browser is outdated, therefore I had a simple third-party script installed to inform them with a fancy popup: browser-update.org/. But it seemed that my users were mostly irritated by it, and sadly, a story of its own, Android phones which get no upgrades would always see the update notice, which is a common problem… therefore I removed this feature.

PWA

I have to say I still love PWAs (progressive web apps), but they only work great with Chromium browsers and, to be honest, the whole thing is a mess in itself. In the last update for iOS they made a few changes to PWA support, but it broke my app completely for a few users. Handling outdated content was not working anymore and users were stuck in an endless loop. In addition, workbox, which I use for building my PWA, has gone a little bit stale; it is made by Google but did not get any love recently. This may change, as discussed in a GitHub issue, but my hopes are slim: GoogleChrome/workbox/issues/3149.

Nevertheless, my current workaround is to have the PWA disabled for all iOS devices. As I only use a few PWA features, it does not hurt. PWA was my road towards supporting a full offline mode, but I will probably never achieve this anyway.

LLM, OpenAI

Last but not least: today I had a completely free coding day and wanted to play around with an idea I have had for a long time, putting my documentation in front of an LLM to answer user questions. The idea itself is nothing new and a few companies already have it implemented on their homepages. Still, I wanted to ride the bandwagon a little bit.

Most guides are in Python, therefore I set up a small playground for myself: github.com/HannesOberreiter/wizbee

Probably the only “fancy” difference is that I use Redis as my vector database, as I had such a good experience with it. After getting my documentation ready and loaded into the vector database, I also connected some publicly available online resources on beekeeping. With this help the bot can also answer beekeeping-specific questions.

The playground was easy to set up, thankfully, as there are already lots of tutorials, plus github.com/hwchase17/langchain. Getting it to work on my live node server, and working well in different languages, took me quite a while on the other hand, as I could not find any Node & Redis & vector guides and the Redis documentation was sometimes a little bit over my head. But I somehow finished it today, although I'm also a little bit burned out. Here is my class for prompt engineering and fetching from the vector database: btree_server/blob/main/src/api/services/wizbee.service.ts
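
Stripped of the Redis specifics, the retrieval part is just “embed the question, find the closest documentation chunks, put them into the prompt”; a minimal in-memory sketch with the OpenAI node SDK (model name and types are assumptions, my real code stores and searches the vectors in Redis):

import OpenAI from 'openai';

const openai = new OpenAI(); // reads OPENAI_API_KEY from the environment

interface Chunk {
  text: string;
  embedding: number[]; // precomputed embedding of a documentation chunk
}

function cosineSimilarity(a: number[], b: number[]): number {
  const dot = a.reduce((sum, v, i) => sum + v * b[i], 0);
  const norm = (v: number[]) => Math.sqrt(v.reduce((s, x) => s + x * x, 0));
  return dot / (norm(a) * norm(b));
}

// Return the topK chunks most similar to the question
async function findRelevantChunks(question: string, chunks: Chunk[], topK = 3) {
  const res = await openai.embeddings.create({
    model: 'text-embedding-ada-002',
    input: question,
  });
  const queryEmbedding = res.data[0].embedding;
  return chunks
    .map((c) => ({ ...c, score: cosineSimilarity(queryEmbedding, c.embedding) }))
    .sort((a, b) => b.score - a.score)
    .slice(0, topK);
}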

Here is a short demo video:

Cheers
Hannes
