
Saturday, 7 November 2015

NetSecL VM - openSuse with VirtualBox + phpvirtualbox

A year ago or so I wrote about VirtualBox and phpVirtualBox as a virtualization solution:

http://yuriystanchev.blogspot.bg/2014/06/virtualbox-and-phpvirtualbox.html

Now I am delighted to present NetSecL VM, which is based on that article and, to my knowledge, works great! You can see a preview in this video:




It is small, based on openSUSE, and all VMs autostart on reboot. You can download it from here. The image is for USB, so you can boot it directly on the desired hardware without using your existing partitions.

https://susestudio.com/u/istanchev

Or here:
http://rsync.netsecl.com/
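For the curious: VM autostart on reboot can be done with VirtualBox's own vboxautostart mechanism. I do not claim this is exactly how the image wires it up, but a minimal sketch (the user name and paths below are examples) looks like this:

```shell
# /etc/default/virtualbox - point VirtualBox at the autostart database
VBOXAUTOSTART_DB=/etc/vbox
VBOXAUTOSTART_CONFIG=/etc/vbox/autostart.cfg

# /etc/vbox/autostart.cfg - allow autostart only for the "vbox" user
default_policy = deny
vbox = {
    allow = true
}
```

Each VM is then marked individually with VBoxManage modifyvm "MyVM" --autostart-enabled on, and the vboxautostart-service starts the allowed VMs on boot.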

Share your opinion and let me know what could be improved!

Yuriy

Monday, 5 October 2015

NetSecL OS v 6

This version of the OS is a general-purpose release, unlike the previous versions, which shipped penetration tools and grsecurity.

Download: http://netsecl.com/download

You can still install grsecurity from here:
https://github.com/kdave/openSUSE-gardened/wiki/openSUSE-gardened




And we will keep v5 on our FTP servers in case any penetration tools are needed.

The main idea is to divide the project into several directions:
- General Purpose OS (NetSecL OS)
- Virtualization (NetSecL VM)
- Tools (NetSecL Toolset) - in progress of renewal

Below are the applications included in this version of the software.
In tests, RAM consumption was 200 MB and CPU usage 3%.
Hardware compatibility

Accessories:
 - Galculator
 - Leafpad
 - Parcellite
 - PDF Mod

Graphics:
 - gOCR
 - Gtkam
 - Image Viewer
 - Pinta
 - Simple Scan

Internet:
 - Chromium
 - Evolution
 - FileZilla
 - Pidgin
 - Remmina (RDP, VNC, XDMCP)
 - Transmission
 - uGet
 - Xmrt
 - Zenmap

Multimedia:
 - Alsa Mixer
 - Brasero
 - Rhythmbox
 - VLC

Office:
 - AbiWord
 - Gnumeric

System:
 - GParted
 - HTop
 - MC
 - Sensor Viewer
 - Task Manager
 - Wireshark

Other:
 - PuTTY
 - Wine

Saturday, 29 August 2015

Visual Studio Code and ASP.NET v5 on Linux

I have to say I never expected I would say these words about Microsoft, but I'm completely AMAZED by what they have done. Visual Studio Code is a very powerful, multi-platform tool. There are tons of articles on how to run ASP.NET v5 on Linux, and not only that: as far as I can see there is a possibility to integrate PostgreSQL; the other option for the moment would be to use MS SQL on a separate server.

Watch the official video from Microsoft:



That is really impressive. And not only that, have a look at these articles:
http://blog.markrendle.net/fun-with-asp-net-5-and-docker/
http://blog.tonysneed.com/2015/05/25/develop-and-deploy-asp-net-5-apps-on-linux/


I am sure this is not the last we will see of this technology, and that it will develop further on Linux.

References:
https://code.visualstudio.com
http://blog.markrendle.net/fun-with-asp-net-5-and-docker/
http://blog.tonysneed.com/2015/05/25/develop-and-deploy-asp-net-5-apps-on-linux/

Saturday, 1 August 2015

10 Windows tools that will make your life easier

Here are 10 free tools I have used in my practice that have proved useful.

If you are a system administrator, you will notice that Windows time synchronization is not that reliable. However, there is a solution to this: NetTime:
http://www.timesynctool.com/

If you have problems figuring out what takes up your disk space, you can always count on TreeSize:
https://www.jam-software.com/treesize_free/

A Windows replacement for Grep - AgentRansack:
https://www.mythicsoft.com/agentransack

Ever had connectivity issues? The problem is that Windows ping does not show a timestamp, but PingInfoView does:
http://www.nirsoft.net/utils/multiple_ping_tool.html

If your Explorer ever crashes, you can disable the faulty extensions with ShellExView:
http://www.nirsoft.net/utils/shexview.html

If you ever wondered whether your USB device is faulty, you can reset its drivers with USBDeview:
http://www.nirsoft.net/utils/usb_devices_view.html

Another really helpful tool: any storage device you have had problems with can be low-level formatted with the HDD LLF Low Level Format Tool:
http://hddguru.com/software/HDD-LLF-Low-Level-Format-Tool/

Technitium MAC Address Changer, another useful tool that can change your MAC address:
https://technitium.com/tmac/

Restoration, which saved me a couple of times, such as when I deleted over 15 GB of files:
http://www3.telus.net/mikebike/RESTORATION.html

HJSplit, a free file splitter:
http://www.hjsplit.org/

Saturday, 6 June 2015

Why 10 bucks are worth more than $60 (SRX-WR150WH)

Update, 2 years later (2017/07/12): After some analysis I see that the router is a re-brand of a TOTOLINK router, but it served its purpose. Now it is gathering dust; MikroTik's Wi-Fi range coverage is just unbeatable in stability and bandwidth.

I bought an SRX-WR150WH router for $10. The brand is Serioux, a Romanian brand that sells pretty cheaply here in Bulgaria. Why would I possibly do this over a Linksys WRT54GL!? Well, first, the WRT54GL was already quite old; second, it is quite huge, which is currently a problem on my desk; and third, I did not want to trash my WRT54GL by experimenting on it.

Why I chose this router:
1. It is very compact.
2. It has a lot of features for the money: http://www.serioux.com/uploads/support/common/7a874-SRX-WR150WH---Declaratie-CE.pdf
It even has NAT and a firewall.
3. The price.
4. Although it has nothing to do with MikroTik, it does a lot for $10 compared to $60.
5. The speed:



6. It obviously runs Linux:
File Type: u-boot legacy uImage, zxrouter, Linux/MIPS, Standalone Program (Not compressed), 713440 bytes, Tue Apr 23 20:26:05 2013, Load Address: 0x80300000, Entry Point: 0x80300000, Header CRC: 0xD022E028, Data CRC: 0x3ED5B432

7. Try exporting the configuration. You might want to have a look at the weak passwords and these settings:

L2T_MTU=4076
WAN_MTU=4076

References: http://www.slatedroid.com/topic/76777-how-to-extract-u-boot-bootimg-for-amlogic-mx-devices/
http://www.checkfiletype.com/upload-and-check
http://www.devttys0.com/2011/05/reverse-engineering-firmware-linksys-wag120n/

Raspberry and arch linux

I got a Raspberry Pi B+ and decided to turn it into a web server for the moment. I installed Arch Linux and have to say that it reminds me a lot of Slackware, although they are not the same.

By the way you can build your own case:
http://sixes.net/rdcHQ/rdchq-extra-credit-raspi-b/
However, my own experience is that it wasn't quite what it should be ... if you really want something that lasts, buy a case.

The first thing to do after you log in is to bring the system and the package database up to date:
pacman -Syu
pacman-db-upgrade

After this we can start installing packages:
pacman -S mc
pacman -S iptraf-ng
pacman -S nginx
pacman -S php php-fpm
pacman -S whois 
pacman -S dnsutils
pacman -S screen

To remove a package and its dependencies which are not required by any other installed package use:
pacman -Rs package_name

Let us enable nginx and php on boot:
systemctl enable nginx php-fpm

Then you might get the following error:
May 24 06:47:55 alarmpi systemd[1]: Starting A high performance web server .....
May 24 06:47:55 alarmpi nginx[19638]: 2015/05/24 06:47:55 [emerg] 19638#0: ...m)
May 24 06:47:55 alarmpi systemd[1]: nginx.service: control process exited, ...=1
May 24 06:47:55 alarmpi systemd[1]: Failed to start A high performance web ...r.
May 24 06:47:55 alarmpi systemd[1]: Unit nginx.service entered failed state.
May 24 06:47:55 alarmpi systemd[1]: nginx.service failed.
Hint: Some lines were ellipsized, use -l to show in full.
[root@alarmpi ~]# systemctl status nginx -l
* nginx.service - A high performance web server and a reverse proxy server
   Loaded: loaded (/usr/lib/systemd/system/nginx.service; enabled; vendor preset: disabled)
   Active: failed (Result: exit-code) since Sun 2015-05-24 06:47:55 MDT; 26s ago
  Process: 19638 ExecStart=/usr/bin/nginx -g pid /run/nginx.pid; error_log stderr; (code=exited, status=1/FAILURE)

The solution to this is:
The solution: in /boot/cmdline.txt, add rw just before rootwait. You might also consider enabling IPv6 with ipv6.disable=0; ufw will need it later.
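For illustration, after the edit a /boot/cmdline.txt might look roughly like this (the other parameters vary per image, so treat this purely as an example, not your exact file):

```text
ipv6.disable=0 root=/dev/mmcblk0p2 rw rootwait console=ttyAMA0,115200
```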

Let's have a look at the web server configuration:
/etc/nginx/nginx.conf

#user html;
worker_processes  1;

#error_log  logs/error.log;
#error_log  logs/error.log  notice;
#error_log  logs/error.log  info;

#pid        logs/nginx.pid;


events {
    worker_connections  1024;
}


http {
    include       mime.types;
    default_type  application/octet-stream;

    #log_format  main  '$remote_addr - $remote_user [$time_local] "$request" '
    #                  '$status $body_bytes_sent "$http_referer" '
    #                  '"$http_user_agent" "$http_x_forwarded_for"';

    #access_log  logs/access.log  main;

    sendfile        on;
    #tcp_nopush     on;

    #keepalive_timeout  0;
    keepalive_timeout  65;

    #gzip  on;

    server {
        listen       80;
        server_name  localhost;

        #charset koi8-r;

        #access_log  logs/host.access.log  main;

        location / {
            root   /usr/share/nginx/html;
            index  index.html index.htm;
        }

        #error_page  404              /404.html;

        # redirect server error pages to the static page /50x.html
        #
        #error_page   500 502 503 504  /50x.html;
       # location = /50x.html {
       #     root   /usr/share/nginx/html;
       # }

        # proxy the PHP scripts to Apache listening on 127.0.0.1:80
        #
        #location ~ \.php$ {
        #    proxy_pass   http://127.0.0.1;
        #}

        # pass the PHP scripts to FastCGI server listening on 127.0.0.1:9000
        #
        location ~ \.php$ {
            root           /usr/share/nginx/html;
            fastcgi_pass   127.0.0.1:9000;
            fastcgi_index  index.php;
            fastcgi_param  SCRIPT_FILENAME  $document_root$fastcgi_script_name;
            include        fastcgi_params;
        }

        # deny access to .htaccess files, if Apache's document root
        # concurs with nginx's one
        #
        #location ~ /\.ht {
        #    deny  all;
        #}
    }


    # another virtual host using mix of IP-, name-, and port-based configuration
    #
    #server {
    #    listen       8000;
    #    listen       somename:8080;
    #    server_name  somename  alias  another.alias;

    #    location / {
    #        root   html;
    #        index  index.html index.htm;
    #    }
    #}


    # HTTPS server
    #
    #server {
    #    listen       443 ssl;
    #    server_name  localhost;

    #    ssl_certificate      cert.pem;
    #    ssl_certificate_key  cert.key;

    #    ssl_session_cache    shared:SSL:1m;
    #    ssl_session_timeout  5m;

    #    ssl_ciphers  HIGH:!aNULL:!MD5;
    #    ssl_prefer_server_ciphers  on;

    #    location / {
    #        root   html;
    #        index  index.html index.htm;
    #    }
    #}

}

And now the final touches. It seems I had some issue with Arch running PHP, and after some research I found it was the "open_basedir" option in php.ini. You should point that parameter to your document root directory; it should be the same as the "root" option in this section:

cat /etc/php/php.ini
; open_basedir, if set, limits all file operations to the defined directory
; http://php.net/open-basedir
open_basedir =  /usr/share/nginx/html
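A quick way to sanity-check that the two settings agree is to compare them. Here is a small sketch that works on stand-in copies of the two files under /tmp (the real paths are /etc/nginx/nginx.conf and /etc/php/php.ini):

```shell
# Create stand-in copies of the two config files (example content only)
cat > /tmp/nginx.conf <<'EOF'
root   /usr/share/nginx/html;
EOF
cat > /tmp/php.ini <<'EOF'
open_basedir = /usr/share/nginx/html
EOF

# Extract the nginx document root and the PHP open_basedir value
ROOT=$(awk '$1 == "root" { gsub(/;/, "", $2); print $2; exit }' /tmp/nginx.conf)
BASE=$(awk -F'=' '/^open_basedir/ { gsub(/ /, "", $2); print $2; exit }' /tmp/php.ini)

# They should be identical, or PHP will refuse to serve files from that root
[ "$ROOT" = "$BASE" ] && echo "open_basedir matches nginx root"
```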

After some analysis I found that there was some interest ;) in my Raspberry Pi, so I secured it with ufw:

ufw allow from <ip> to any port <port number>
ufw enable
ufw status
systemctl enable ufw

Note that "ufw enable" activates the firewall itself, while "systemctl enable ufw" only makes the service start on boot.

The numbers of your firewall rules can be listed using:

ufw status numbered

To delete rules:
ufw delete ####
where #### is the number of your firewall rule.



Changing the local time: # ln -s /usr/share/zoneinfo/Europe/Minsk /etc/localtime


Other nice ideas please see the references. Cheers!

References: 
http://blog.tersmitten.nl/ufw-delete-firewall-rules-by-number.html
http://guides.webbynode.com/articles/security/ubuntu-ufw.html


Monday, 11 May 2015

Crontab and wget

I had the following issue: I had to reduce a pretty big XML file to a small file to be downloaded by a mobile app. What I did was pretty simple. In the system crontab on my server I set a job, running every minute, for XML compression:

* * * * * root /get.sh >> /tmp/getsh.out 2>&1

In the script (/get.sh):
wget -O /var/www/html/test.xml "http://<domain/ip>/?xml&param=1param&papmm=2param"
bzip2 -c /var/www/html/test.xml > /var/www/html/test.xml.bz2
rm -f /var/www/html/test.xml

So the script downloads the XML, bzip2 shrinks it, and the resulting archive is what the app downloads.
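The compression step itself is easy to try on any sample file. A small sketch, using a throwaway file under /tmp rather than the real feed:

```shell
# Make a small sample XML file standing in for the downloaded feed
printf '<items><item>1</item><item>2</item></items>\n' > /tmp/sample.xml

# Compress it; -k keeps the original, -f overwrites an old archive
bzip2 -kf /tmp/sample.xml

# Verify the archive decompresses back to the original content
bzip2 -dc /tmp/sample.xml.bz2 | grep -q '<item>2</item>' && echo "archive OK"
```

With -k you keep the source file, which avoids the separate rm step in the original script if you prefer to leave the raw XML in place.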