2017-12-10



Paul Graham Quotes

  • You need three things to create a successful startup: to start with good people, to make something customers actually want, and to spend as little money as possible.
  • A startup is like a mosquito. A bear can absorb a hit and a crab is armored against one, but a mosquito is designed for one thing: to score. No energy is wasted on defense. The defense of mosquitos, as a species, is that there are a lot of them, but this is little consolation to the individual mosquito.
  • Don't ignore your dreams; don't work too much; say what you think; cultivate friendships; be happy.
  • I'm not saying there's no such thing as genius. But if you're trying to choose between two theories and one gives you an excuse for being lazy, the other one is probably right.
  • The most important thing is not to let fundraising get you down. Startups live or die on morale. If you let the difficulty of raising money destroy your morale, it will become a self-fulfilling prophecy.

2017-11-10

How to use different gateways by IP address on Linux and Mac

on Linux
$ sudo route add -net 192.168.0.0 netmask 255.255.255.0 gw 192.168.0.1
$ sudo route add -net 192.168.1.0 netmask 255.255.255.0 gw 192.168.0.254
$ sudo route add -net 192.168.0.0 netmask 255.255.255.0 dev eth0
$ sudo route add -net 192.168.1.0 netmask 255.255.255.0 dev eth1
$ sudo route add -net 192.168.0.0/24 gw 192.168.0.1
$ sudo route add -net 192.168.1.0/24 gw 192.168.0.254
$ sudo route add -net 192.168.0.0/24 dev eth0
$ sudo route add -net 192.168.1.0/24 dev eth1
on Mac
$ sudo route -n add -net 192.168.0.0 -netmask 255.255.255.0 192.168.0.1
$ sudo route -n add -net 192.168.1.0 -netmask 255.255.255.0 192.168.0.254
$ sudo route -n add -net 192.168.0.0/24 192.168.0.1
$ sudo route -n add -net 192.168.1.0/24 192.168.0.254
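The kernel picks the route whose network contains the destination, where the network is the bitwise AND of address and netmask. A quick sketch of that arithmetic (the helper functions here are mine for illustration, not standard tools):

```shell
# Compute network = address AND netmask, to see which route a destination falls into.
# ip_to_int / int_to_ip / network_of are ad-hoc helpers, not system commands.
ip_to_int() { set -- $(echo "$1" | tr '.' ' '); echo $(( ($1<<24) | ($2<<16) | ($3<<8) | $4 )); }
int_to_ip() { echo "$(( ($1>>24)&255 )).$(( ($1>>16)&255 )).$(( ($1>>8)&255 )).$(( $1&255 ))"; }
network_of() { int_to_ip $(( $(ip_to_int "$1") & $(ip_to_int "$2") )); }

network_of 192.168.0.42 255.255.255.0   # -> 192.168.0.0
network_of 192.168.1.42 255.255.255.0   # -> 192.168.1.0
network_of 192.168.1.42 255.255.0.0     # -> 192.168.0.0 (a /16 mask merges both subnets)
```

This also shows why per-subnet gateways need /24 masks here: under a /16 mask both 192.168.0.x and 192.168.1.x collapse into the same 192.168.0.0 network.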

2017-11-09

[MAC] convert png to icns with terminal

Use sips to convert png to icns
$ sips
sips 10.4.4 - scriptable image processing system.

This tool is used to query or modify raster image files and ColorSync ICC profiles.
Its functionality can also be used through the "Image Events" AppleScript suite.
Try 'sips --help' or 'sips --helpProperties' for help using this tool

Command
$ sips -s format icns input.png --out output.icns

[MAC] SOMETHING ON DEVICES can't be opened because the original item can't be found

"Cloud" can't be opened because the original item can't be found. In my case, "Cloud" was not an external device.



Try these three steps to resolve it.
1. Relaunch Finder

  • Option + right-click the Finder icon in the Dock -> select Relaunch
  • or open Terminal and run "killall Finder"


2. Force quit Finder

  • Apple menu -> Force Quit... -> select Finder -> click Relaunch
  • or Option+Command+Escape -> select Finder -> click Relaunch

3. Log out, then log in again. Now there is no more "Cloud" item.

2017-09-17

Difference among tall tales, legends, myths, fairy tales and fables

TALL TALEs humorously exaggerate facts with plausible story lines.

LEGENDs embellish and glorify historical facts or fiction passed down through generations.

MYTHs explain natural phenomena with supernatural beings or creators based on religion.

FAIRY TALEs usually depict a conflict between clearly separated good and evil, with magic and imaginary creatures.

FABLEs teach a moral with talking animals.

2017-02-12

Hide files and folders with an Encrypted disk image on Mac

1. Create an Encrypted disk image file
$ hdiutil create -size 100g -layout GPTSPUD -fs "Journaled HFS+" -volname "Secured Repository" -type SPARSEBUNDLE -encryption AES-128 EncDiskImg

Enter a new password to secure "EncDiskImg.sparsebundle": **********
Re-enter new password: **********

created: /Users/dee/Cloud/EncDiskImg.sparsebundle
* For more options
 $ hdiutil create -help
or use Disk Utility


2. Mount
Double-click the image in Finder, then enter the password you set above.
or
$ hdiutil mount EncDiskImg.sparsebundle

Enter password to access "EncDiskImg.sparsebundle": **********

/dev/disk2           GUID_partition_scheme
/dev/disk2s1        EFI
/dev/disk2s2        Apple_HFS /Volumes/Secured Repository

3. You can also hide the file so it does not show up in Finder. Note that the "ls" command in Terminal still lists it.
$ setfile -a V EncDiskImg.sparsebundle
* Use AppleScript to mount a hidden image.
do shell script "hdiutil mount /PATH/TO/EncDiskImg.sparsebundle"
It will prompt for the password.

2017-01-31

How to set DYLD_LIBRARY_PATH in Xcode

Open Product > Scheme > Edit Scheme > Run > Arguments
Add DYLD_LIBRARY_PATH under Environment Variables


dyld: Library not loaded: libboost_system.dylib

Got an error after compiling against the Boost libraries and running a sample program.
$ ./PROGRAM
dyld: Library not loaded: libboost_system.dylib
  Referenced from: /PATH/TO/PROGRAM
  Reason: image not found
Abort trap: 6
Checked the linked libraries with otool, then tried install_name_tool, but it still failed.
$ otool -L PROGRAM
PROGRAM:
libboost_system.dylib (compatibility version 0.0.0, current version 0.0.0)
libboost_thread.dylib (compatibility version 0.0.0, current version 0.0.0)
/usr/lib/libc++.1.dylib (compatibility version 1.0.0, current version 307.4.0)
/usr/lib/libSystem.B.dylib (compatibility version 1.0.0, current version 1238.0.0)

$ install_name_tool -change libboost_system.dylib /PATH/TO/libboost_system.dylib PROGRAM
$ install_name_tool -change libboost_thread.dylib /PATH/TO/libboost_thread.dylib PROGRAM
Solution: add the following to .profile or .bash_profile
export DYLD_LIBRARY_PATH=/PATH/TO/BOOST/stage/lib
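If you prefer not to export the variable globally, it can also be scoped to a single run with the VAR=value CMD form; a small sketch below (the Boost path and PROGRAM name are placeholders from above). Note that on recent macOS releases, System Integrity Protection strips DYLD_* variables when launching protected system binaries, so this only helps programs you build yourself.

```shell
# The VAR=value CMD form sets the variable for that one child process only.
# Real use would look like: DYLD_LIBRARY_PATH=/PATH/TO/BOOST/stage/lib ./PROGRAM
DYLD_LIBRARY_PATH=/tmp/demo sh -c 'echo "$DYLD_LIBRARY_PATH"'   # prints /tmp/demo
echo "${DYLD_LIBRARY_PATH:-unset}"   # parent shell unchanged (prints "unset" unless already exported)
```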

2017-01-11

Install Spark on Ubuntu

# Install Java

$ sudo apt install python-software-properties
$ sudo add-apt-repository ppa:webupd8team/java
$ sudo apt update
$ sudo apt install oracle-java8-installer

# Install Scala
$ sudo apt install scala
# Install Spark
Download prebuilt Spark from http://spark.apache.org/downloads.html
$ wget http://d3kbcqa49mib13.cloudfront.net/spark-2.1.0-bin-hadoop2.7.tgz
$ sudo mv spark-2.1.0-bin-hadoop2.7.tgz /opt
$ cd /opt
$ sudo tar xfz spark-2.1.0-bin-hadoop2.7.tgz
$ sudo ln -s spark-2.1.0-bin-hadoop2.7 spark
# Add environment variables and add path
$ vi ~/.profile
JAVA_HOME=/usr/lib/jvm/java-8-oracle
SCALA_HOME=/usr/share/scala
SPARK_HOME=/opt/spark
PYTHONPATH=$SPARK_HOME/python/lib/pyspark.zip:$SPARK_HOME/python/lib/py4j-0.10.4-src.zip
export JAVA_HOME SCALA_HOME SPARK_HOME PYTHONPATH
PATH="$HOME/bin:$HOME/.local/bin:$PATH:$SPARK_HOME/bin"
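After sourcing the profile, a quick way to confirm that $SPARK_HOME/bin actually made it onto PATH before trying spark-shell (the /opt/spark value is assumed from the steps above):

```shell
# Check whether $SPARK_HOME/bin is on PATH; the colon padding avoids substring false matches.
SPARK_HOME=/opt/spark
PATH="$PATH:$SPARK_HOME/bin"
case ":$PATH:" in
  *":$SPARK_HOME/bin:"*) echo "spark bin on PATH" ;;
  *)                     echo "spark bin missing" ;;
esac
```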
# Edit configuration (optional) to reduce the logs displayed
$ cd $SPARK_HOME/conf
$ sudo cp log4j.properties.template log4j.properties
$ sudo vi log4j.properties
FIND : log4j.rootCategory=INFO, console
REPLACE : log4j.rootCategory=WARN, console
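The same FIND/REPLACE can be done non-interactively with sed instead of vi (GNU sed shown; BSD/macOS sed needs an argument after -i, e.g. -i ''). Demonstrated here on a scratch copy in /tmp rather than the real config:

```shell
# Flip the root logger from INFO to WARN, non-interactively.
printf 'log4j.rootCategory=INFO, console\n' > /tmp/log4j.properties   # demo input
sed -i 's/^log4j.rootCategory=INFO, console$/log4j.rootCategory=WARN, console/' /tmp/log4j.properties
cat /tmp/log4j.properties   # -> log4j.rootCategory=WARN, console
```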
# Test if Spark works
$ run-example SparkPi 10
Pi is roughly 3.140963140963141
# Launch Spark Shell(run scala language) or PySpark(run python language)
$ spark-shell
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.1.0
      /_/
        
Using Scala version 2.11.8 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_111)
Type in expressions to have them evaluated.
Type :help for more information.
scala >
$ pyspark
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /__ / .__/\_,_/_/ /_/\_\   version 2.1.0
      /_/
Using Python version 2.7.12 (default, Nov 19 2016 06:48:10)
SparkSession available as 'spark'.
>>>
# Spark UI
http://localhost:4040

Install Spark on Windows 10

# Download
JAVA from http://www.oracle.com/technetwork/java/javase/downloads/index.html
SCALA from http://www.scala-lang.org/download/
Prebuilt Spark from http://spark.apache.org/downloads.html
WINUTILS from https://github.com/steveloughran/winutils
# Installation
Install Java and Scala
Unzip prebuilt Spark into C:\Usr
Unzip winutils (hadoop-2.7.1/bin) into C:\Usr\spark-2.1.0-bin-hadoop2.7\bin
# Set environment variables
JAVA_HOME=C:\Program Files\Java\jdk1.8.0_111
_JAVA_OPTIONS=-Xmx512M -Xms512M
SCALA_HOME=C:\Program Files\scala
SPARK_HOME=C:\Usr\spark-2.1.0-bin-hadoop2.7
HADOOP_HOME=C:\Usr\spark-2.1.0-bin-hadoop2.7
# Add path
%SCALA_HOME%\bin
%SPARK_HOME%\bin
# Grant permissions (create C:\tmp\hive first if it does not exist)
> winutils.exe chmod 777 C:\tmp\hive
# Run command
> scala -version
Picked up _JAVA_OPTIONS: -Xmx512M -Xms512M
Scala code runner version 2.12.1 -- Copyright 2002-2016, LAMP/EPFL and Lightbend, Inc.
> spark-shell 
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.1.0
      /_/
Using Scala version 2.11.8 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_111)
Type in expressions to have them evaluated.
Type :help for more information.
scala>