Change module loading priority - embedded

I would like to know the right way to change module loading priorities in Linux. I want the HDMI and LCD output to come up as quickly as possible.
Right now it takes 3 seconds to appear. I know the delay is not due to HDMI or the TV, because the first thing I see on screen is some lines about mali init (mali is the name of the GPU here).
I use an A10-Olinuxino-Lime board with a homemade rootfs generated using Buildroot and a custom kernel made for this kind of processor (linux-sunxi).
The tree of /etc/:
etc/
├── dhcp
│   ├── dhclient.conf
│   └── dhcpd.conf
├── dropbear
├── fstab
├── group
├── hostname
├── hosts
├── init.d
│   ├── rcK
│   ├── rcS
│   ├── S01logging
│   ├── S20urandom
│   ├── S40network
│   ├── S50dropbear
│   ├── S80dhcp-relay
│   ├── S80dhcp-server
│   ├── S80mali
│   └── S99TVOS
├── inittab
├── inittab~
├── inputrc
├── issue
├── ld.so.cache
├── ld.so.conf
├── ld.so.conf.d
├── mtab -> /proc/mounts
├── network
│   ├── if-down.d
│   ├── if-post-down.d
│   ├── if-post-up.d
│   ├── if-pre-down.d
│   ├── if-pre-up.d
│   ├── if-up.d
│   └── interfaces
├── nsswitch.conf
├── os-release
├── passwd
├── profile
├── protocols
├── random-seed
├── resolv.conf -> ../tmp/resolv.conf
├── securetty
├── services
├── shadow
├── ts.conf
└── wpa_supplicant.conf
Do you have any ideas?

I'd create an /etc/init.d/S00modules script containing a sequence of insmod (or modprobe, if your environment supports it) lines.
If that doesn't help, then your modules are loaded even earlier, and you'll have to find how and where that happens. I'd first look at /sbin/init, or whatever is used instead.
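For illustration, a minimal sketch of such a script, assuming a Buildroot-style SysV init; the .ko names and paths below are placeholders, not the actual sunxi driver names, so substitute whatever your kernel really builds as modules:
#!/bin/sh
# /etc/init.d/S00modules - hypothetical example: load the display stack before
# everything else. Check /lib/modules/$(uname -r)/ for the real module names.
KVER="$(uname -r)"
case "$1" in
  start)
        # insmod needs full paths; modprobe (if your env has it) resolves
        # names and dependencies by itself.
        insmod "/lib/modules/$KVER/kernel/drivers/video/sunxi/lcd.ko"
        insmod "/lib/modules/$KVER/kernel/drivers/video/sunxi/hdmi.ko"
        ;;
  stop)
        ;;
  *)
        echo "Usage: $0 {start|stop}"
        ;;
esac
Because the S-scripts in /etc/init.d are run in lexical order by rcS, the S00 prefix makes this run before S01logging and everything after it.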

Related

Unable to download Tensorflow C Language Bindings as an ExternalProject

I'm trying to include the C language bindings for Tensorflow found at https://storage.googleapis.com/tensorflow/libtensorflow/libtensorflow-gpu-linux-x86_64-2.5.0.tar.gz in my CMake project. Unfortunately, it seems as though nothing is being downloaded, as the TENSORFLOW-prefix/src/TENSORFLOW directory is empty. I'm new to CMake and am not sure where I am going wrong. Any help would be appreciated.
Relevant source:
cmake_minimum_required(VERSION 3.17)
include(ExternalProject)
project(tfexec)
set(CMAKE_CXX_STANDARD 17)
ExternalProject_Add(TENSORFLOW
    URL "https://storage.googleapis.com/tensorflow/libtensorflow/libtensorflow-gpu-linux-x86_64-2.5.0.tar.gz"
    CONFIGURE_COMMAND ""
    BUILD_COMMAND ""
    INSTALL_COMMAND ""
    LOG_DOWNLOAD 1
    LOG_CONFIGURE 1
    LOG_BUILD 1
    LOG_INSTALL 1
)
ExternalProject runs at build time, so you actually need to run your build for this to do anything. This is what I see; it seems to be working fine:
alex@Alex-Desktop:~/test$ cmake-3.17 -S . -B build -DCMAKE_BUILD_TYPE=Release
...
-- Configuring done
-- Generating done
-- Build files have been written to: /home/alex/test/build
alex@Alex-Desktop:~/test$ cmake --build build/ -- -v
...
alex@Alex-Desktop:~/test$ tree build/TENSORFLOW-prefix/src/TENSORFLOW
build/TENSORFLOW-prefix/src/TENSORFLOW
├── LICENSE
├── THIRD_PARTY_TF_C_LICENSES
├── include
│   └── tensorflow
│       ├── c
│       │   ├── c_api.h
│       │   ├── c_api_experimental.h
│       │   ├── c_api_macros.h
│       │   ├── eager
│       │   │   ├── c_api.h
│       │   │   ├── c_api_experimental.h
│       │   │   └── dlpack.h
│       │   ├── tensor_interface.h
│       │   ├── tf_attrtype.h
│       │   ├── tf_datatype.h
│       │   ├── tf_file_statistics.h
│       │   ├── tf_status.h
│       │   ├── tf_tensor.h
│       │   └── tf_tstring.h
│       └── core
│           └── platform
│               ├── ctstring.h
│               └── ctstring_internal.h
└── lib
    ├── libtensorflow.so -> libtensorflow.so.2
    ├── libtensorflow.so.2 -> libtensorflow.so.2.5.0
    ├── libtensorflow.so.2.5.0
    ├── libtensorflow_framework.so -> libtensorflow_framework.so.2
    ├── libtensorflow_framework.so.2 -> libtensorflow_framework.so.2.5.0
    └── libtensorflow_framework.so.2.5.0
7 directories, 23 files
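If the goal is then to compile against the extracted headers and libraries, one possible follow-up (not part of the original snippet; the tfexec target and main.cpp source are assumptions) is to query the extraction directory and wire it into a target:
# Sketch only: SOURCE_DIR resolves to TENSORFLOW-prefix/src/TENSORFLOW after extraction
ExternalProject_Get_Property(TENSORFLOW SOURCE_DIR)

add_executable(tfexec main.cpp)       # main.cpp is a placeholder source file
add_dependencies(tfexec TENSORFLOW)   # ensure the download/extract step runs first
target_include_directories(tfexec PRIVATE ${SOURCE_DIR}/include)
target_link_libraries(tfexec PRIVATE
    ${SOURCE_DIR}/lib/libtensorflow.so
    ${SOURCE_DIR}/lib/libtensorflow_framework.so)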

IntelliJ import subfolder as project

Given a code structure like the one below, which contains documentation at the root level, how can I tell IDEA to import the Code/spark subfolder as a project?
.
├── Code
│   ├── foo
│   │   ├── bar
│   │   └── baz
│   └── spark
│       ├── build.sbt
│       ├── common
│       ├── job1
│       ├── project
│       ├── run_application
│       ├── sbt
│       ├── scalastyle-config.xml
│       └── version.sbt
├── Docs
You need to add a content root: go to the Project Structure settings (Ctrl + Alt + Shift + S), select your module, then on the right panel click Add Content Root and select the Docs folder. Then you can select it and mark it as part of the module; for documentation I believe it should be Resources.
Even better: use a proper build tool like Gradle and then apply composite builds: https://docs.gradle.org/current/userguide/composite_builds.html
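As a rough illustration of the composite-builds idea (this assumes Code/spark were itself a Gradle build, which the sbt layout above is not; the file and project names here are hypothetical), a settings.gradle at the repository root could pull the nested build in:
// settings.gradle at the repository root (hypothetical)
rootProject.name = 'repo-root'
// makes the nested build's projects and tasks visible from the root build
includeBuild 'Code/spark'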

cmake out of source build not working for nested source directories

My source code has the following directory structure:
.
├── build
├── CMakeLists.txt
├── libs
│   ├── CMakeLists.txt
│   ├── ext
│   │   └── include
│   │       └── json.hpp
│   └── int
│       ├── CMakeLists.txt
│       └── net
│           ├── CMakeFiles
│           │   ├── CMakeDirectoryInformation.cmake
│           │   └── progress.marks
│           ├── cmake_install.cmake
│           ├── CMakeLists.txt
│           ├── include
│           │   ├── connection.hpp
│           │   └── conn_manager.hpp
│           ├── Makefile
│           └── src
│               ├── CMakeFiles
│               │   ├── CMakeDirectoryInformation.cmake
│               │   ├── libs.dir
│               │   │   ├── build.make
│               │   │   ├── cmake_clean.cmake
│               │   │   ├── cmake_clean_target.cmake
│               │   │   ├── DependInfo.cmake
│               │   │   ├── depend.make
│               │   │   ├── flags.make
│               │   │   ├── link.txt
│               │   │   └── progress.make
│               │   └── progress.marks
│               ├── cmake_install.cmake
│               ├── CMakeLists.txt
│               ├── connection.cpp
│               ├── conn_manager.cpp
│               └── Makefile
└── main
    ├── CMakeLists.txt
    ├── config
    │   └── config.json
    └── srcs
        ├── CMakeLists.txt
        ├── main.cpp
        ├── samvaadak.cpp
        └── samvaadak.hpp
"build" is a folder from which I am running cmake. It runs and creates the final output executable inside build/dist/bin. But it also creates lot of intermediate inside the source tree (libs and main) and making it cluttered.
The top level CMakeLists.txt file looks like this.
project (samvaadak)
subdirs(main libs)
set (CMAKE_CXX_STANDARD 11)
set(CMAKE_LIBRARY_OUTPUT_DIRECTORY ${CMAKE_BINARY_DIR}/dist/lib)
set(CMAKE_ARCHIVE_OUTPUT_DIRECTORY ${CMAKE_BINARY_DIR}/dist/lib)
set(CMAKE_RUNTIME_OUTPUT_DIRECTORY ${CMAKE_BINARY_DIR}/dist/bin)
Intermediate CMakeLists.txt files have the following content:
subdirs(src)
And the CMakeLists.txt files for the inner folders, where the source files are present, look like this:
include_directories(${samvaadak_SOURCE_DIR}/libs/int/net/include)
include_directories(${samvaadak_SOURCE_DIR}/libs/ext/include)
add_library(nettu conn_manager.cpp connection.cpp)
set(CMAKE_LIBRARY_OUTPUT_DIRECTORY ${CMAKE_BINARY_DIR}/dist/lib)
set(CMAKE_ARCHIVE_OUTPUT_DIRECTORY ${CMAKE_BINARY_DIR}/dist/lib)
set(CMAKE_RUNTIME_OUTPUT_DIRECTORY ${CMAKE_BINARY_DIR}/dist/bin)
How do I tell CMake to build outside of the libs and main folders and keep everything inside build?
You can disable in-source builds:
set(CMAKE_DISABLE_IN_SOURCE_BUILD ON)
Also recommended:
set(CMAKE_DISABLE_SOURCE_CHANGES ON)
They need to be set before the first project() command.
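As a sketch of where that goes, the top of the root CMakeLists.txt would look something like this (the ordering is the point; the rest of the file stays as before):
cmake_minimum_required(VERSION 3.0)

# These must come before the first project() command
set(CMAKE_DISABLE_IN_SOURCE_BUILD ON)
set(CMAKE_DISABLE_SOURCE_CHANGES ON)

project(samvaadak)
# ... rest of the original file (subdirs, output directories) unchanged
Note that the CMakeFiles, cmake_install.cmake and Makefile entries already sitting inside libs/ and main/ look like leftovers from an earlier in-source run; these variables only prevent new ones, so the old files still have to be deleted by hand.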

Which elm package versions are installed?

elm-package can manage dependencies for elm, but according to running it without arguments, the only commands it supports (as of version 0.18.0) are install, publish, bump and diff. I was expecting something like elm-package list to show the installed packages.
Is there a command to list the currently installed elm package versions?
I don't think there is one, but you can run tree elm-stuff/packages -L 3 --noreport on the command line.
You will get a tree like this:
elm-stuff/packages
├── debois
│   ├── elm-dom
│   │   └── 1.2.3
│   └── elm-mdl
│       └── 8.1.0
├── elm-lang
│   ├── core
│   │   └── 5.1.1
│   ├── dom
│   │   └── 1.1.1
│   ├── html
│   │   └── 2.0.0
│   ├── http
│   │   └── 1.0.0
│   ├── mouse
│   │   └── 1.0.1
│   ├── virtual-dom
│   │   └── 2.0.4
│   └── window
│       └── 1.0.1
├── mgold
│   └── elm-date-format
│       └── 1.2.0
└── thaterikperson
    └── elm-strftime
You can also just cat elm-stuff/exact-dependencies.json, but there is no guarantee that they are actually installed:
{
    "debois/elm-mdl": "8.1.0",
    "elm-lang/virtual-dom": "2.0.4",
    "elm-lang/mouse": "1.0.1",
    "mgold/elm-date-format": "1.2.0",
    "elm-lang/dom": "1.1.1",
    "elm-lang/html": "2.0.0",
    "elm-lang/http": "1.0.0",
    "debois/elm-dom": "1.2.3",
    "elm-lang/window": "1.0.1",
    "elm-lang/core": "5.1.1"
}
If you use the Light Table editor with the elm-light plug-in, then you have a command to show (and add) packages.

ansible: roles structure and variables

I have been using Ansible for a while with standalone playbooks and would now like to set up a role structure in my environment.
This is the folder structure (example, summarised):
├── hosts
├── playbooks
│   ├── project1-staging.yml
│   └── project1-production.yml
├── roles
│   ├── project1-compile
│   │   ├── files
│   │   │   └── project1.conf
│   │   ├── handlers
│   │   │   └── main.yml
│   │   ├── meta
│   │   │   └── main.yml
│   │   ├── tasks
│   │   │   └── main.yml
│   │   ├── templates
│   │   └── vars
│   │       └── main.yml
│   ├── ec2-create
│   │   ├── handlers
│   │   │   └── main.yml
│   │   ├── meta
│   │   │   └── main.yml
│   │   ├── tasks
│   │   │   └── main.yml
│   │   ├── templates
│   │   └── vars
│   │       └── main.yml
│   └── project1-deploy
│       ├── handlers
│       │   └── main.yml
│       ├── meta
│       │   └── main.yml
│       ├── tasks
│       │   └── main.yml
│       ├── templates
│       └── vars
│           └── main.yml
├── vars.yml
It looks straightforward.
I would like to execute the project1-staging.yml playbook to create a new staging environment for a specific version, like this:
ansible-playbook project1-staging.yml -e 'version=1'
with the playbook below:
---
- name: deploy project1 (staging) {{ version }}
  hosts: local
  connection: local
  roles:
    #- project1-compile version={{ version }}
    - { role: ec2-create, project: project1, count: 1 }
    - { role: project1-compile, version: "{{ version }}" }
    - { role: project1-deploy, version: "{{ version }}", target: "{{ last_ec2 }}" }
There are some problems with this structure, and I also don't like it.
- Is this the proper way?
- How can I use the result of the ec2-create role? I would like to deploy code to the server that was just created, but I don't know its ID.
- Is there another method to pass parameters to roles?
Take a look at the inventory modules, more specifically the add_host module.
Synopsis
Use variables to create new hosts and groups in inventory for use in
later plays of the same playbook. Takes variables so you can define
the new hosts more fully.
Examples
# add host to group 'just_created' with variable foo=42
- add_host: name={{ ip_from_ec2 }} groups=just_created foo=42
# add a host with a non-standard port local to your machines
- add_host: name={{ new_ip }}:{{ new_port }}
# add a host alias that we reach through a tunnel
- add_host: hostname={{ new_ip }}
            ansible_ssh_host={{ inventory_hostname }}
            ansible_ssh_port={{ new_port }}
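To tie that back to the second question, here is a hedged sketch of how the ec2-create role's result could feed add_host and a follow-up play. The fact name new_instance_ip and the group name just_created are assumptions (your ec2-create role would have to register/set that fact itself), and the play layout mirrors the staging playbook above:
---
# Play 1: runs locally, creates the instance and puts it into an in-memory group
- hosts: local
  connection: local
  roles:
    - { role: ec2-create, project: project1, count: 1 }
  tasks:
    - name: add the freshly created instance to the inventory
      add_host:
        name: "{{ new_instance_ip }}"   # assumed fact set by ec2-create
        groups: just_created

# Play 2: deploys to whatever ended up in that group
- hosts: just_created
  roles:
    - { role: project1-deploy, version: "{{ version }}" }
This avoids having to know the instance ID in advance: the deploy role simply targets the just_created group that the first play populated.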