Tuesday, December 5, 2017

SD Card mounts read-only in Linux

The title might be misleading since my setup is Linux Mint 18.3, but I have seen all kinds of postings related to this problem across Linux forums.  I have tried all kinds of remedies suggested online; they seem to be hit or miss and I can't narrow the problem down.  Windows does not have any of these symptoms and can heal the card even if you forget to eject it properly.

I have tried all the obvious fixes, such as changing permissions on the filesystem to 777 and removing files as root or with sudo.  I had one success with this method, but the problem returned every time I ejected the SD card, used it in a speaker player, and then tried to access it again on the computer.

I tried fsck and it did work one time, even though I am always careful to eject [umount] the card before removing it.
fsck -a /dev/sdb
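Since these SD cards are usually FAT-formatted, a hedged variant is to point the FAT-specific checker at the partition rather than the whole device (this assumes the data partition is /dev/sdb1; adjust to match your card):
# unmount first, then let fsck.vfat auto-repair the FAT filesystem
sudo umount /dev/sdb1
sudo fsck.vfat -a /dev/sdb1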

Gparted didn't work.  But if I had continued to try, I think one of these days it would have worked.  I rebooted so many times that I felt like I was using Windows Vista.

The dd command didn't seem to work at all:
dd if=/dev/zero of=/dev/sdb bs=1M count=10

I noticed one time that my SD card had residual trash (.Trash-1000) on the card; however, nothing could remove it, including root.  Then I found out that you have to delete the user's Trash, then umount and remount.  This worked once, then failed again.
rm -rf ~/.local/share/Trash/* 

A different scenario was where I plugged the card in, ejected it, then inserted it again, and eventually it worked.  In the Nemo file manager, the tell was whether or not I had the Delete option, which indicated that the SD card was now in read-write mode.

I have tried umounting and then remounting with rw options:
sudo fdisk -l
sudo mount -t vfat -o rw /dev/sdb1 /mnt
or
sudo mount -o remount,rw /dev/sdb1 /media/username/usbname
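Before remounting, it can also help to confirm whether the kernel really mounted the card read-only or flagged it write-protected; a quick check, assuming the card shows up as /dev/sdb1:
# look for "ro" among the mount options
findmnt /dev/sdb1
# look for write-protect or I/O errors reported by the kernel
dmesg | grep -i -E 'sdb|write.?protect' | tail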

Finally, I really think that adding myself and root to the disk group is what worked.
sudo usermod -G disk --append username
sudo usermod -G disk --append root
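To confirm the group change took effect (it only applies after logging out and back in), something like this, where username is a placeholder:
# "disk" should appear in the supplementary groups after a fresh login
id username
groups username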

Thursday, November 23, 2017

Bitnami Redmine Installation on Ubuntu 16.04

Go to https://bitnami.com/stack/redmine and download the latest version of the Bitnami Redmine local installer for Linux.
Upload the binary to the server, then run it as root:
chmod +x bitnami-redmine-<version>-linux-x64-installer.run

./bitnami-redmine-<version>-linux-x64-installer.run

Made a symlink for easy access

ln -s /opt/redmine<version> /opt/redmine

Start/Stop Service

The native installer also includes a command-line script to start, stop and restart applications, named ctlscript.sh. This script can be found in the installation directory and accepts the options start, stop, restart, and status. To use it, log in to the server console and execute it following the examples below:
  • Call it without any service names to start all services:
    Note: Only use sudo if the stack was installed as root
    sudo installdir/ctlscript.sh start
    
  • Use it to restart a specific service only by passing the service name as an argument - for example, mysql or apache:
    Note: Only use sudo if the stack was installed as root
    sudo installdir/ctlscript.sh restart mysql
    sudo installdir/ctlscript.sh restart apache
    
  • Obtain current status of all services:
    installdir/ctlscript.sh status
    
The list of available services varies depending on the required components for each application.
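With the /opt/redmine symlink created earlier, the calls would look something like this (a sketch; adjust the path if your install directory differs):
# assumes the stack was installed as root and symlinked to /opt/redmine
sudo /opt/redmine/ctlscript.sh status
sudo /opt/redmine/ctlscript.sh restart apache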

Email Fetching from Gmail

Install rake to provide /usr/bin/rake
sudo apt-get install rake
Crontab
*/2 5-22 * * * /opt/redmine/helpdeskEmailPoll.sh > /opt/redmine/helpdeskEmailPoll.log
File: /opt/redmine/helpdeskEmailPoll.sh
#!/bin/bash
source /opt/redmine/scripts/setenv.sh
cd /opt/redmine/apps/redmine/htdocs

RAILS_ENV="production" /usr/bin/rake redmine:email:receive_imap host=imap.gmail.com port=993 ssl=1 username=helpdesk@ssis.edu.vn password=h3lpdesk\! project=helpdesk unknown_user=create no_permission_check=1 no_account_notice=1
To check when the cronjob last polled the helpdesk email, look at /opt/redmine/helpdeskEmailPoll.log.
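The crontab entry above polls every two minutes between 05:00 and 22:59. The poller script has to be executable, and it is worth running it once by hand before trusting the cron schedule; a hedged check:
# make the script executable and run it once manually to confirm IMAP access works
sudo chmod +x /opt/redmine/helpdeskEmailPoll.sh
sudo /opt/redmine/helpdeskEmailPoll.sh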

Use a MySQL trigger to "fix" the Active Directory user auto-creation

A bit of a hack, but it gets the job done. Run this using the "/opt/redmine/mysql/bin/mysql -uroot -p" command line client:
delimiter //
CREATE TRIGGER `user_before_insert` BEFORE INSERT
ON `users`
FOR EACH ROW BEGIN
  IF NEW.login LIKE '%@ssis.edu.vn' THEN
    SET NEW.login = REPLACE(NEW.login, '@ssis.edu.vn', '');
    SET NEW.auth_source_id = (SELECT id FROM auth_sources WHERE type = "AuthSourceLdap" AND name LIKE "SSIS%" LIMIT 1);
  END IF;
END//

CREATE TRIGGER `user_before_update` BEFORE UPDATE
ON `users`
FOR EACH ROW BEGIN
  IF NEW.login LIKE '%@ssis.edu.vn' THEN
    SET NEW.login = REPLACE(NEW.login, '@ssis.edu.vn', '');
    SET NEW.auth_source_id = (SELECT id FROM auth_sources WHERE type = "AuthSourceLdap" AND name LIKE "SSIS%" LIMIT 1);
  END IF;
END//

CREATE TRIGGER `user_after_insert` AFTER INSERT
ON `users`
FOR EACH ROW BEGIN
  IF NEW.auth_source_id = '1' THEN
    SET @groupid := (SELECT DISTINCT id FROM users WHERE type = "Group" AND lastname = "SSIS Users");
    INSERT INTO groups_users (group_id, user_id) VALUES (@groupid, NEW.id);
  END IF;
END//

delimiter ;
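To confirm the triggers exist, query the Redmine database from the same client (the database name below assumes the Bitnami default, bitnami_redmine):
/opt/redmine/mysql/bin/mysql -uroot -p -e 'SHOW TRIGGERS' bitnami_redmine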

Accessing PhpMyAdmin On Linux And Mac OS X

To access the application using your Web browser, create an SSH tunnel, as described below.
  • Open a new terminal window on your local system (for example, using "Finder -> Applications -> Utilities -> Terminal" in Mac OS X or the Dash in Ubuntu).
  • You have two options to configure the SSH tunnel: connect to the server using a private key (recommended) or connect using an SSH password. Follow the instructions below for the option you choose:
    • Option 1: Connect to the server using a private key
      • Make sure that you have your SSH credentials (.pem key file) in hand.
      • Run the following command to configure the SSH tunnel, remembering to replace KEYFILE with the path to your private key:
        ssh -N -L 8888:127.0.0.1:80 -i KEYFILE sysad@srvr-uredmine.ssis.edu.vn
        
    • Option 2: Connect to the server using an SSH password
      • Run the following command and enter your SSH password when prompted.
        ssh -N -L 443:127.0.0.1:443 sysad@srvr-uredmine.ssis.edu.vn
        
NOTE: If successful, the above commands will create an SSH tunnel but will not display any output on the server console.
  • Access the phpMyAdmin console through the secure SSH tunnel you created, by browsing to http://127.0.0.1:8888/phpmyadmin (if you used the first tunnel) or https://127.0.0.1/phpmyadmin (if you used the second).
  • Log in to phpMyAdmin by using the following credentials:
    • Username: root
    • Password: application password

File Attachments

Every year, Redmine creates a new directory under /opt/redmine/apps/redmine/htdocs/files/<year> with ownership root:root.  We need to change it to daemon:daemon in order for our users to upload new files.  Keep in mind to perform this every year, or write a cronjob to perform the task.
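A hedged sketch of the yearly fix, either run by hand or dropped into root's crontab (the path comes from this install; daemon is the Bitnami application user):
# fix ownership of the current year's attachment directory
sudo chown -R daemon:daemon /opt/redmine/apps/redmine/htdocs/files/$(date +%Y)
# or as a yearly root cron entry (note the escaped % required inside crontab)
# 5 0 1 1 * chown -R daemon:daemon /opt/redmine/apps/redmine/htdocs/files/$(date +\%Y)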

Tuesday, November 14, 2017

Redirect multiple domains with NGINX

Web server

(web.example.com):  /etc/nginx/sites-available/redirect
server {
    listen 80;
    server_name records.example.com summer.example.com helpdesk.example.com;
    if ($host = 'records.example.com') {
        return 301 http://docs.google.com/;
    }
    if ($host = 'summer.example.com') {
        return 301 http://domainsomewhere.net;
    }
    if ($host = 'helpdesk.example.com') {
        return 301 http://youtube.com/sdfasa;
    }
}


ln -s /etc/nginx/sites-available/redirect /etc/nginx/sites-enabled/redirect
nginx -t
service nginx restart

DNS

The DNS entries point to the web server (i.e. records is an alias/CNAME for web.example.com).
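Once DNS has propagated, a quick hedged check that the alias and redirect behave as expected:
# confirm the alias resolves and the redirect is returned (expect a 301 plus a Location header)
dig +short records.example.com
curl -sI http://records.example.com | grep -E '^(HTTP|Location)'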

Thursday, November 2, 2017

Check website page load and trace route

Here are two very nice website page-load and trace-route checkers.  They also provide recommendations on how to improve the page's performance.

http://www.monitis.com/pageload/
https://tools.pingdom.com

Thursday, October 26, 2017

Output Result from df Command STDOUT to HTML

An example of where we need this is displaying the output of the df command on a webpage.  Spaces and newlines are disregarded by HTML, so the output will display as one long line.

PRE Block in HTML

Solution: Wrap output such as this in a pre block in HTML.
STORAGE=$(df -PTh | column -t | sort -n -k6n)
echo "<pre>$STORAGE</pre>"
Outside of a pre block, though, HTML does not preserve white space.

To preserve the white space

The output of the command chain contains white space in the form of spaces and newlines (and you are lucky there are no tabs). You can instead pipe the output through sed, replacing each space with &nbsp; and prefixing each line with <br>:
STORAGE=$(df -PTh | column -t | sort -n -k6n | sed 's/ /\&nbsp;/g' | sed 's/^/<br>/')
This preserves the whitespace without the font-changing effect that <pre> induces.

HTML Table Format

$ printf "<pre>%s</pre>\n" "$storage" >> file.html
There should be no need to include column. This is a candidate for a HTML table, and could be begotten by something like:df -PTh | \
sed '1d' | \ sort -n -k6 | \ awk ' { printf "\n\t<tr>"; for (n = 1; n < 7; ++n) printf("\n\t<td>%s</td>",$n); printf "\n\t<td>"; for(;n <= NF; ++n) printf("%s ",$n); printf "</td>\n\t</tr>" } '
Wrap it in something like:
<!DOCTYPE html>
<html>
<head>
<meta charset="UTF-8">
<title>Disk space usage</title>
<style>
table, td, th {
  border : 1px solid green;
}
th {
  background-color: green;
  color : white;
}
</style>
</head><body>
<table>
  <tr>
    <th>Filesystem</th>
    <th>Type</th>
    <th>Size</th>
    <th>Used</th>
    <th>Avail</th>
    <th>Use%</th>
    <th>Mounted on</th>
  </tr>
  <!-- df output awk table -->
  <?php include('file.html'); ?>
</table>
</body>
</html>
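If PHP is not available, a hedged alternative is to build a static page instead of using the include. Save everything above the include line as header.html and everything below it as footer.html (names here are placeholders), regenerate file.html with the awk pipeline above, then stitch the pieces together; a cron entry can keep the page current:
# assemble the static page after regenerating file.html (output path is a placeholder)
cat header.html file.html footer.html > /var/www/html/diskusage.html
# example cron schedule to refresh every 15 minutes (script path is a placeholder)
# */15 * * * * /usr/local/bin/df-to-html.sh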

Tuesday, October 24, 2017

Disable/Delete Inactive AD Users

Export a CSV file of users inactive for over 90 days (note that dsquery's -inactive argument is measured in weeks)

Dsquery user "OU=Faculty Staff, OU=All Users,DC=SSIS,DC=EDU,DC=VN" –inactive 90  > inactiveUsers.csv

alternately

Locate Inactive Accounts 

Use Powershell run as Administrator
To find users that have not logged into the system and have not reset their password in the last 90 days:
$90Days = (get-date).adddays(-90)
Get-ADUser -SearchBase "OU=Faculty Staff, OU=All Users,DC=SSIS,DC=EDU,DC=VN" -Filter {(lastlogondate -notlike "*" -OR lastlogondate -le $90days) -AND (passwordlastset -le $90days) -AND (enabled -eq $True)} -Properties lastlogondate, passwordlastset | Select-Object name, lastlogondate, passwordlastset | Export-Csv inactiveUsers.csv -NoTypeInformation

Disable the Inactive Accounts

This will update the description [with a date to help determine the deletion date later] and disable the account using the PassThru switch:

Get-ADUser -SearchBase "OU=Faculty Staff, OU=All Users,DC=SSIS,DC=EDU,DC=VN" -Filter {lastlogondate -le $90days -AND passwordlastset -le $90days} -Properties lastlogondate, passwordlastset | Set-ADUser -Description ((get-date).toshortdatestring()) -PassThru | Disable-ADAccount

Delete Inactive Accounts

Now that we have all the accounts disabled, we need to delete them. We can use the Remove-ADObject cmdlet to delete the account, and then use Get-ADUser to read the Description attribute. To compare the date that the account was disabled to the current date, we can use Where-Object, as shown here:

$14days = (get-date).adddays(-14)
Get-Aduser -SearchBase "OU=Faculty Staff, OU=All Users,DC=SSIS,DC=EDU,DC=VN" -Filter {enabled -eq $False} -properties description | Where { (get-date $_.Description) -le $14Days} | remove-adobject

The Remove-ADObject command will prompt for every user before deleting the accounts. To get a list only, you can use WhatIf, or if you do not want to be prompted, you can use Confirm:$False, as shown here:

Get-Aduser -SearchBase "OU=Faculty Staff, OU=All Users,DC=SSIS,DC=EDU,DC=VN" -Filter {enabled -eq $False} -properties description | Where { (get-date $_.Description) -le $14Days} | remove-adobject -WhatIf

Get-Aduser -SearchBase "OU=Faculty Staff, OU=All Users,DC=SSIS,DC=EDU,DC=VN" -Filter {enabled -eq $False} -properties description | Where { (get-date $_.Description) -le $14Days} | remove-adobject -Confirm:$False

All-in-One Script

#import the ActiveDirectory Module
Import-Module ActiveDirectory
#Create a variable for the date stamp in the log file
$LogDate = get-date -f yyyyMMddhhmm
#Sets the OU to do the base search for all user accounts, change for your env.
$SearchBase = "OU=Faculty Staff, OU=All Users,DC=SSIS,DC=EDU,DC=VN"
#Create an empty array for the log file
$LogArray = @()
#Sets the number of days to delete user accounts based on value in description field
$Disabledage = (get-date).adddays(-14)
#Sets the number of days to disable user accounts based on lastlogontimestamp and pwdlastset.
$PasswordAge = (Get-Date).adddays(-90)
#RegEx pattern to verify date format in user description field.
$RegEx = '^(0[1-9]|1[012])[- /.](0[1-9]|[12][0-9]|3[01])[- /.](20)\d\d$'
#Use ForEach to loop through all users with description date older than date set. Deletes the accounts and adds to log array.
ForEach ($DeletedUser in (Get-Aduser -searchbase $SearchBase -Filter {enabled -eq $False} -properties description ) ){
  #Verifies description field is in the correct date format by matching the regular expression from above to prevent errors with other disabled users.
  If ($DeletedUser.Description -match $Regex){
    #Compares date in the description field to the DisabledAge set.
    If((get-date $DeletedUser.Description) -le $Disabledage){
      #Deletes the user object. This will prompt for each user. To suppress the prompt add "-confirm:$False". To log only add "-whatif".
      Remove-ADObject $DeletedUser
        #Create new object for logging
        $obj = New-Object PSObject
        $obj | Add-Member -MemberType NoteProperty -Name "Name" -Value $DeletedUser.name
        $obj | Add-Member -MemberType NoteProperty -Name "samAccountName" -Value $DeletedUser.samaccountname
        $obj | Add-Member -MemberType NoteProperty -Name "DistinguishedName" -Value $DeletedUser.DistinguishedName
        $obj | Add-Member -MemberType NoteProperty -Name "Status" -Value 'Deleted'
        #Adds object to the log array
        $LogArray += $obj
    }
  }
}
#Use ForEach to loop through all users with pwdlastset and lastlogontimestamp greater than date set. Also added users with no lastlogon date set. Disables the accounts and adds to log array.
ForEach ($DisabledUser in (Get-ADUser -searchbase $SearchBase -filter {((lastlogondate -notlike "*") -OR (lastlogondate -le $Passwordage)) -AND (passwordlastset -le $Passwordage) -AND (enabled -eq $True)} )) {
  #Sets the user objects description attribute to a date stamp. Example "11/13/2011"
  set-aduser $DisabledUser -Description ((get-date).toshortdatestring())
  #Disabled user object. To log only add "-whatif"
  Disable-ADAccount $DisabledUser
    #Create new object for logging
    $obj = New-Object PSObject
    $obj | Add-Member -MemberType NoteProperty -Name "Name" -Value $DisabledUser.name
    $obj | Add-Member -MemberType NoteProperty -Name "samAccountName" -Value $DisabledUser.samaccountname
    $obj | Add-Member -MemberType NoteProperty -Name "DistinguishedName" -Value $DisabledUser.DistinguishedName
    $obj | Add-Member -MemberType NoteProperty -Name "Status" -Value 'Disabled'
    #Adds object to the log array
    $LogArray += $obj
}
#Exports log array to CSV file in the temp directory with a date and time stamp in the file name.
$logArray | Export-Csv "C:\Temp\User_Report_$logDate.csv" -NoTypeInformation

Monday, October 23, 2017

Inkscape crashes every time opening existing svg file

Symptom

Every time I open an existing SVG generated by Inkscape, I get this error:
terminate called after throwing an instance of 'Glib::ConvertError' 
Emergency save activated!
Emergency save completed. Inkscape will close now.
Note that to see this error, you must launch Inkscape from a terminal.  If you launch it from the GUI, you will only get a generic message that Inkscape has crashed and will close now, which is a useless piece of information.

Solution

Remove the recently-used file list:
rm $HOME/.local/share/recently-used.xbel

Thursday, October 19, 2017

Banshee lost sound after HDMI usage

I had used the HDMI port to play a movie on the TV, and of course I used HDMI as the sound output.  After unplugging the HDMI cable from the TV, Banshee was no longer able to produce any sound through the built-in speakers, and I had no way to release the output from the HDMI.

The solution was to plug my laptop back into the TV via HDMI then set the sound's output back to the built-in speaker before unplugging the HDMI.

System: Lenovo Ultrabook 15 Flex, Linux Mint 18.1, Banshee 2.6.3

Tuesday, October 17, 2017

Find missing value from one file to another

To find lines (values) that are in one file but missing from the other.

Using grep by combining the -v (show non-matching lines), -x (match whole lines) and -f (read patterns from file) options:
grep -v -x -f B.txt A.txt
This does not depend on the order of the files - it will remove any lines from A that match a line in B.
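An alternative sketch using comm, assuming both files can be sorted first:
# lines in A.txt that do not appear in B.txt (comm requires sorted input)
comm -23 <(sort A.txt) <(sort B.txt)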

Friday, October 13, 2017

Ubuntu free up /boot to upgrade

Old Linux kernel images can fill up your /boot partition.  But there are times when /boot is already at 100%, so you cannot perform an autoremove or an upgrade.
sudo apt-get -f autoremove
lsb_release -a;uname -a;dpkg -l|grep linux-image
Note the currently running Linux kernel image and avoid removing it. Don't worry: if your current image has a dependency, dpkg will warn you and will not remove it.
dpkg --purge <linux-image-not-needed> <linux-image-extra-not-needed>
sudo apt-get -f autoremove 
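A hedged helper to see which installed kernel images are candidates for removal (everything except the running one):
# list installed kernel image packages, excluding the version currently running
dpkg -l 'linux-image*' | awk '/^ii/ {print $2}' | grep -v "$(uname -r)"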


Wednesday, October 4, 2017

Pitfall of Ubuntu 14.04 to 16.04 Upgrade

Here are some notes that really can get you in trouble with the Ubuntu 14.04 to 16.04 upgrade:

In Ubuntu 16.04, important changes since the preceding LTS release include a transition to the systemd init system in place of Upstart, an emphasis on Python 3 support, and PHP 7 in place of PHP 5.

01.  In a VMware 6.0 environment using the VMXNET3 ethernet adapter, the interface name will change.  To fix this, find the new interface name with
lshw -C Network
then update /etc/network/interfaces with the new interface name, then bring the interface up:
ifup <interface_name>
02. Django will definitely break.  To satisfy the dependencies, locate the requirements.txt then
sudo apt-get install python-setuptools
sudo easy_install --upgrade django

pip install --upgrade pip
pip install -r <path/to/requirements.txt>

03. Nginx will also break with the new /etc/nginx/nginx.conf so you might have to use the old nginx.conf.dpkg-old backup 

04.  For Wordpress using NGINX, here is an article that covers all the necessary steps: https://thecustomizewindows.com/2016/09/upgrade-ubuntu-server-14-04-16-04-live-wordpress/

Monday, September 25, 2017

Replace HTTP to HTTPS

Quickly replacing all occurrences of http with https within a file can be done easily with sed:
sed -i 's|http:|https:|g' *.html
 
or recursively with find:
find . -name "*.html" -exec sed -i 's|http:|https:|g' {} \; 

Thursday, September 14, 2017

Increase Max Filesize Upload in Wordpress



This addresses specifically the Media Library upload limitation within WordPress.

NGINX Configuration

Modified /etc/nginx/nginx.conf within the http section

http {
      # set client body size to 200M #
      client_max_body_size 200M;

PHP Configuration


Modified three items in php.ini, within both /etc/php/7.0/cli/php.ini and /etc/php/7.0/fpm/php.ini:

  • upload_max_filesize = 200M
  • post_max_size = 200M
  • memory_limit = 128M
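The new limits only take effect once PHP-FPM (and NGINX) are reloaded; on Ubuntu 16.04 with the stock PHP 7.0 packages that would be something like:
sudo systemctl restart php7.0-fpm
sudo systemctl reload nginx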

WP Plugin


Used the Increase Upload Max Filesize plugin.

Thursday, August 31, 2017

Bulk Resize Images in Ubuntu

ImageMagick Installation
sudo apt-get update
sudo apt-get install imagemagick -y
Resize to either a maximum height or width, keeping proportions, using ImageMagick:
cd /path/to/output
find </path/to/images> \( -iname '*.jpg' -o -iname '*.png' \) -exec convert {} -resize 'WIDTHxHEIGHT>' {} \;
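Note that the command above rewrites each image in place. A hedged variant that leaves the originals alone and writes resized copies into the output directory instead (paths and size are placeholders):
# mogrify -path writes results into the given directory rather than overwriting the source
cd </path/to/images>
mogrify -path </path/to/output> -resize '1024x1024>' *.jpg *.png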

Thursday, August 24, 2017

Thursday, August 17, 2017

Clone HDD Remotely

If your intent is to back up a remote computer's HDD via SSH to a single file on your local computer's HDD, you could do one of the following.

Find the hard drive to back up. Notice that the whole drive is just the letters (e.g. /dev/sda) and not the number, since the numbers denote the partitions.
df -h
Run from remote computer
sudo dd if=/dev/sda | gzip -1 - | ssh user@local dd of=image.gz
Run from local computer
sudo ssh user@remote "dd if=/dev/sda | gzip -1 -" | dd of=image.gz
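To restore later, the image can be streamed back the other way; a hedged sketch (the target device is a placeholder, so double-check it before running):
# restore the compressed image onto a disk; /dev/sdX is a placeholder for the target drive
gunzip -c image.gz | sudo dd of=/dev/sdX bs=1M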

Methods for monitoring?

  1. Log in via ssh in another terminal and ls -l the file to see what its size is.
  2. You can use pv to monitor the progress of a large dd operation, for instance, for the remote example above, you can do:
    $ dd if=/dev/sda | gzip -1 - | pv | ssh user@local dd of=image.gz
    
  3. Send a "SIGUSR1" signal to dd and it will print stats. Something like:
    $ pkill -USR1 dd
    

Recursive Search

Here are two ways to find a pattern within files recursively:

cd </starting/path>
grep -r -i <pattern> ./*

cd <path to start search>
find . -type f -print0 | xargs -0 grep <pattern>

Delete Windows.old from Windows Server



Q: How can I delete the windows.old from an upgraded Windows Server 2012R2?

A: For a client OS that's upgraded, the Disk Cleanup utility can be used to delete the very large windows.old folder containing the old OS. This isn't available on a server OS without installing the Desktop Experience feature.

Command line tools to take ownership and delete the folder. Make sure to “run as administrator” with cmd [PowerShell will not work]:
takeown /F c:\Windows.old\* /R /A /D Y
cacls c:\Windows.old\*.* /T /grant administrators:F
rmdir /S /Q c:\Windows.old

! - If you run into problems such as access denied or path/file not found, try the following [remember the command terminal must be run as administrator]:
cacls c:\Windows.old\*.* /reset /T /C /L /Q

!!!! - When all else fails, use cygwin and remove it with the following:
Open Cygwin as an administrator
cd /cygdrive/c
rm -rf Windows.old

Saturday, July 29, 2017

Ubuntu Tool for Creating Bootable USB

I have tried all kinds of variations of the dd command, with suggestions to format with FAT or NTFS, to use isohybrid, and to make sure to sync at the end of the dd process; yet none of those techniques worked.

The other day, I was referred to a product called mkusb and it worked beautifully.  It has different options for creating Linux or Windows images; I guess there are different implementations of a bootable USB for the different OSes.  This product also runs dd commands in the background, so obviously the developers know the right parameters and provisioning process to make it work.  I will dig into their code to figure out what I did wrong in the near future and then post my findings.

Reference: https://help.ubuntu.com/community/mkusb

Here is the skinny version to install and launch

sudo add-apt-repository universe # only for standard Ubuntu

sudo add-apt-repository ppa:mkusb/ppa
sudo apt-get update
sudo apt-get install mkusb mkusb-nox usb-pack-efi

mkusb