Casino strategic planning and methods for successful implementation

Attracting customers in local and international markets is a perennially relevant problem. Today, leading casino-management specialists must use analytical tools to study market dynamics and develop appropriate strategies. Revenue analysis, resource management, and sound marketing principles are the core factors behind a casino operator's success.

To succeed in this sector, gaming establishments need to take a flexible approach to strategy development, because shifting customer needs force market assumptions to be revised. The data gathered through analytical tools therefore lets managers track and optimize their performance indicators.

And if you want to learn how to apply the right strategies, betandreas azerbaijan offers more detailed information. Remember: with an appealing, modern approach, a noticeable increase in the customer base is achievable.

Customer Acquisition Methods

Identifying customers is a critical task for casinos. It requires analytical tools to understand customer behaviour and meet customer demands. Collecting varied data to track customer expectations makes it possible to engage with customers more effectively.

Revenue analysis plays a significant role in shaping customer-acquisition plans. By helping operators understand how their customer base behaves, it underpins effective resource management; combined with mature strategy development, it gives operators room to grow that base.

When the right indicators of customer behaviour are put to use by casino management, they make it possible to optimize revenue streams. This simplifies the assessment of market dynamics and raises customer satisfaction.

Applying analytical tools shapes not only the acquisition tactics themselves but also the metrics used to judge them. These metrics provide valuable data for measuring the effectiveness of customer acquisition and improving it.

Many customer-acquisition methods centre on bringing the casino's unique offers to market. When building a broad customer base, applying these methods systematically increases the likelihood that customers return.

Because market dynamics are constantly changing, managers are pushed to test new strategies. Combined with sound resource management, this makes long-term customer relationships possible.

In short, customer-acquisition methods remain a critical element for a casino. Updating them regularly matters for growing the customer base and expanding the business.

Ways to Build a Competitive Advantage

Succeeding in a competitive environment requires a solid grasp of market dynamics. Every business should build its strategies by tracking customer behaviour and market trends. Using data to adapt to market changes is essential for increasing customer acquisition.

Effective marketing campaigns play a large role in attracting customers, and doing so calls for innovative approaches. With analytical tools, marketers can analyse customer segments and tailor offers to their needs.

Resource management increases a business's resilience. Allocating resources correctly and using them efficiently produces better results, and optimizing work processes lets casinos schedule their operations more efficiently.

  • Segmented market analysis helps in understanding customer behaviour.
  • Revenue analysis is essential for accurately assessing a business's financial position.
  • Customer-satisfaction metrics matter for raising service quality.

Casino management should develop more dynamic strategies grounded in modern approaches. Combining management experience with modern technology lays the groundwork for more efficient decision-making.

Successful businesses anticipate market changes and respond to customer demands. By cultivating customer relationships, they take the right steps toward long-term success.

Financial management and risk analysis

Financial management plays a major role in defining and tracking a casino's performance indicators. The core aim is to conduct revenue analysis through resource management and to forecast customer behaviour in light of market dynamics. This approach lets casino management develop effective marketing strategies while keeping risks to a minimum.

Risk analysis requires analytical tools to identify potential financial losses in advance. Every casino operator should track market changes, adapt to customer demands, and shape strategy accordingly. Managing the relevant risks in this way matters for raising both customer satisfaction and profitability.

Ultimately, the approaches taken to financial management and risk analysis are essential for optimizing casino operations. Revenue analysis and a deep understanding of market dynamics play the central role in creating a successful environment, and accurately reading intentions and trends makes casino management's long-term strategies more durable.

Applying marketing strategies

Applying marketing strategies effectively requires the right approach to resource management. That approach supports the development of new customer-acquisition methods. Analytical tools allow a deeper understanding of customer behaviour, which in turn allows market dynamics to be analysed properly.

By starting with revenue analysis, casino management can ensure resources are used efficiently. Using a range of analytical methods both raises customer satisfaction and boosts profitability, and it helps refine the steps taken toward strategic development.

Studying customer behaviour is a significant factor in customer acquisition. Tracking market dynamics offers an ideal path to closer customer relationships, creating a solid platform for raising customer satisfaction.

Revenue analysis is a foundation for applying digital marketing strategies that grow the customer base. At the same time, resource management helps extract maximum value from that revenue, an approach that secures the business's long-term development.

Marketing Strategy          Impact
Customer analysis           Higher customer satisfaction
Market research             Understanding of market dynamics
Digital advertising         Revenue growth
Use of social media         Broader customer acquisition

Understanding customer behaviour is closely tied to the success metrics of marketing strategies. Analytical tools simplify analysing that behaviour, allowing customer relationships to strengthen further; that focus alone increases customer acquisition.

Casino management's strategy development results in adaptation to market dynamics. Understanding customer needs improves the quality of the services offered, which builds customer trust and secures loyalty.

Applying marketing strategies is a necessity for success today. Only approaches that respond to customer behaviour can win, and proper resource management together with revenue analysis leads to long-term achievement.

Questions and answers:

Why does strategic planning matter for casinos?

Strategic planning helps casinos adapt to market demands and operate successfully. The planning process covers aspects such as financial management, improving the customer experience, and optimal use of resources; strategic goals are then set to strengthen the casino's future position.

How do casinos carry out the strategic-planning process?

Casinos carry out strategic planning in several stages. First comes in-depth market analysis, examining customer demands and the competitive environment. Targets are then set, and an action plan is drawn up for reaching them, covering resource allocation, marketing strategies, and the optimization of operating procedures.

How can casinos use customer feedback in strategic planning?

Customer feedback is a valuable source of information for casinos. It is used to raise customer satisfaction and improve services. Based on feedback, casinos can adjust their strategies and introduce innovative initiatives that improve the customer experience, thereby increasing customer loyalty.

What are the long-term benefits of strategic planning for casinos?

Long-term strategic planning helps casinos establish a firm position in the market, strengthen financial stability, and grow the customer base. This approach lets casinos respond better to customer demands and gives them a competitive advantage.

Effective strategies for strengthening your poker skills

Introduction to Poker: Understanding the Basics and Fundamental Rules

Poker is a fascinating game that combines skill and strategy. To begin, it is essential to understand the basic rules, such as hand rankings and how each variant is played, from Texas Hold'em to Omaha. Practising, whether online or in person, is essential for building confidence and improving your game.

A crucial aspect of poker is reading your opponents. Interpreting their reactions and behaviour offers decisive strategic advantages. Analysing a game helps you spot opponents' weaknesses, while the feedback you receive from other players is fundamental to your own improvement.

Taking part in tournaments is an unmissable chance to put your strategies into practice. During these events you can test the techniques you have learned and deepen your study of the game. Targeted drills are another excellent way to sharpen your skills and build experience, while the feedback you receive https://eplay24s.it/ will help improve your performance further.

Finally, it is important to take advantage of available resources, such as books, articles, and videos, to broaden your knowledge. Networking with other enthusiasts can also offer valuable opportunities to learn and share winning strategies.

The Importance of Practice and Feedback for Improving Skills

Practice is fundamental to sharpening any skill, whether athletic or intellectual. Through constant exercise, a player internalizes the necessary techniques, such as game strategies and reading opponents. Taking part in tournaments and competitions also offers the ideal setting to apply what has been learned.

Another essential element is feedback. Receiving comments from coaches and teammates makes it possible to correct mistakes and keep improving. This process of game analysis is crucial: recording and reviewing your own performances helps pinpoint areas for improvement.

Studying dedicated resources, such as videos or articles, and discussing strategy with other players through networking can provide further useful insights. The combination of practice, feedback, and continuous learning creates a virtuous circle that leads to real progress.

Techniques for Reading Opponents: Behavioural and Strategic Analysis

Reading opponents is a fundamental skill for anyone who competes in tournaments. Understanding opponents' behaviour not only helps predict their moves but also offers the chance to develop more effective strategies. One of the most practical techniques is analysing opponents' reactions during a game: by observing their playing patterns, you can adapt your own strategy in real time.

For example, feedback exercises, whether individual or in groups, can markedly improve your analytical skills. Studying previous games, paying particular attention to how opponents reacted in specific situations, offers valuable insights. Online resources and networking communities, such as forums and discussion groups, can enrich this learning process further.

Active participation in tournaments can also prove vital for honing these techniques, since it exposes players to different styles and approaches. Learning to read non-verbal signals, such as body language, offers a strategic edge, making every game an opportunity for personal and professional growth.

Advanced Strategies: Game Analysis and Planning for Tournament Success

Succeeding in tournaments requires advanced strategies that include in-depth game analysis. This analysis is not just about understanding your own performance but also about reading your opponents. Through practice drills and targeted feedback, players can refine their skills.

A key element is studying the available resources, such as game footage and statistics, which can offer valuable insights. Playing tournaments across different categories also provides varied experience and encourages networking with other players and coaches.

Finally, creating a detailed plan for each tournament is crucial. The plan should include specific goals and strategies based on game analysis, to ensure optimal preparation. A systematic approach not only improves performance but also builds self-confidence, turning every competition into an opportunity for growth.

Resources and Networking: How to Use Communities and Study Material to Grow in Poker

To become a successful poker player, it is essential to combine practice and feedback. Playing in local tournaments not only offers the chance to test your strategies but also brings valuable input. Every hand played is an opportunity for game analysis.

Reading your opponents is an art; observing their behaviour can prove crucial to developing winning strategies. That is why investing time in studying available resources, such as instructional videos and manuals, is fundamental. Practical exercises, such as online poker, reinforce the techniques learned.

Networking with other enthusiasts through forums and social groups offers a further advantage. Sharing experiences and learning from others creates an environment for growth. Finally, attending workshops and seminars is a unique opportunity to deepen your skills.

Payment Methods on Bankonbet: Speed and Guaranteed Security

Introduction to Payment Methods on Bankonbet

In an increasingly digital world, the choice of payment method plays a crucial role for users of platforms like Bankonbet. Transaction security is a priority, and using credit cards and e-wallets offers a high level of data protection, giving users peace of mind in their online dealings.

Bankonbet stands out for the variety of its payment options, from traditional methods such as bank transfers to modern solutions such as cryptocurrencies. The latter offer not only fast fund management but also the possibility of anonymous transactions, suited to those seeking greater privacy.

Withdrawal speed is another fundamental element that Bankonbet guarantees. The most innovative methods, such as e-wallets, deliver funds in record time, while the more classic options, though slower, offer Bankonbet long-term advantages.

Finally, the user experience is enhanced by simple, intuitive management of the payment options. A user-friendly interface makes it easy for users to select their preferred method while ensuring continuous, effective data protection.

Advantages of Secure Transactions

Secure transactions are a fundamental pillar of sustainable modern commerce. Using credit cards, e-wallets, and even cryptocurrencies, consumers can enjoy fast, protected operations. These methods offer not only faster withdrawals but also more efficient fund management.

Another advantage of secure transactions is data protection. Advanced encryption technologies keep personal information safe from potential cyberattacks, offering a calmer purchasing experience.

In addition, many payment options come with transparent fees, letting customers know exactly which charges apply. This level of clarity is crucial, especially when making bank transfers.

With growing dependence on e-commerce, trust in secure transactions is now more essential than ever, creating a favourable environment for businesses that want to thrive.

Available Payment Options: Credit Cards, E-Wallets and Cryptocurrencies

Today, users have a range of payment options, making transactions smoother and more secure. Credit cards are the most common choice, thanks to near-universal acceptance and the data protection offered by providers. This payment type supports secure transactions, giving customers a degree of peace of mind.

E-wallets are rapidly gaining popularity for their convenience and withdrawal speed. Through platforms such as PayPal or Apple Pay, users can manage funds simply, avoiding hidden charges and enjoying transparent fees. Data protection, again, is always a priority, providing an extra layer of security.

Finally, cryptocurrencies are an innovative, modern option that draws the attention of many investors. They offer unique advantages, such as low transaction costs and the possibility of real-time transfers. With the growing acceptance of cryptocurrencies, it is essential to stay informed and weigh the effectiveness of these new alternatives.

Withdrawal Speed and Transparent Fees

When it comes to managing funds online, withdrawal speed is a crucial factor. Users often want quick access to their winnings, and this need has pushed many platforms to optimize their systems. Secure transactions via credit cards, e-wallets, and cryptocurrencies ensure that withdrawals proceed smoothly and safely.

Bank transfers remain a popular choice, but they can involve delays. For efficient fund management, it is essential that platforms clearly communicate the fees associated with each withdrawal method, allowing users to choose the option best suited to their needs.

Every transaction must meet the highest data-protection standards, so that users' personal information is always safe. In this context, transparency becomes fundamental to building trust between users and service providers.

Finally, offering a variety of payment options not only improves the user experience but also promotes greater overall satisfaction. Investing in efficient withdrawal systems and clear communication about fees can benefit everyone involved.

Security and Data Protection in Fund Management

In the world of digital finance, data protection is fundamental. Secure transactions via credit cards and e-wallets guarantee not only privacy but also users' peace of mind.

Using cryptocurrencies and bank transfers offers valid alternatives, characterized by transparent fees and excellent withdrawal speed. Choosing the right payment option increases security without compromising efficiency.

Security safeguards such as two-factor authentication and encryption protect your funds. Adopting these measures helps prevent fraud and maintain trust in the fund-management process.

Recover gigs of storage with this simple hack

I recovered 30 gigs of storage on my MBP (MacBook Pro) with this simple hack of converting my PNGs to JPEGs. Here’s how.

If you own a Mac or an iPhone, you'll notice both devices save screenshots with a PNG extension. Portable Network Graphics (.png) is an image file format that uses lossless compression to save your images. You may be thinking this lossless compression is the best format, but the reality is that JPEG, a lossy format, is just as good at retaining image quality at a MASSIVE fraction of the size of a PNG.

Image sizes really matter when your device runs 1080p+ resolutions (most modern Macs) or the newer OLED iPhones. A full-size (no crop) screenshot on my MBP at a resolution of 2080×1080 yields a 7MB (7,168KB) image. The JPEG equivalent is 500KB. That's ~14x smaller. If you are like me and take screenshots as reminders or todos (GTD baby!) then you'll be chewing through storage fast.

A hundred of these PNGs and you'll be reaching 1GB territory.
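Both of those figures are easy to sanity-check with shell arithmetic (the 7,168KB and ~500KB sizes are the measurements quoted above):

```shell
# ~14x: 7,168 KB PNG vs ~500 KB JPEG (integer division)
echo "$(( 7168 / 500 ))x smaller"     # prints "14x smaller"
# 100 full-size screenshots at ~7 MB each
echo "$(( 100 * 7 )) MB of PNGs"      # prints "700 MB of PNGs"
```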

With storage being dirt cheap, who cares, right? Not so. Laptops with external drives are annoying and iPhones don't have expansion cards. Furthermore, data transfer is a burden, considering upload speeds are conveniently ignored yet important if you are backing up to the cloud. And let's face it, who's got time to sit around waiting when the same lot of photos with identical quality (assuming you aren't blowing them up on a wall) can be backed up to your Dropbox cloud storage 14x faster? GTD baby!

Did I mention this will also speed up Spotlight searching and indexing, extend your SSD's life, and open those images faster?

Convert PNGs to JPEGS

PNGs are losslessly compressed images, produced mainly by things like screenshots on your Mac or iPhone. They don't need to stay PNGs unless you really are picky about the quality of text sharpness, i.e. under JPEG, text becomes a tad more blurry since the compression reuses surrounding pixels to make the image smaller.

Overall, the case for keeping PNGs is weak unless you do a lot of photo editing and need that pixel-level detail, especially for font/text clarity.

[1] Identify Opportunities

Run a scan to identify where the opportunities (PNGs) are located on your drive.

$ find . -type f -iname '*.png' | wc -l

find . -type f finds all files ( -type f ) in this ( . ) directory and all subdirectories; the filenames are then printed to standard output, one per line.

This is then piped | into wc (word count); the -l option tells wc to count only the lines of its input.

If you only want files directly under this directory and not to search recursively through subdirectories, you could add the -maxdepth flag:

$ find some_directory -maxdepth 1 -type f | wc -l

The key to the case-insensitive search is the -iname option, which is only one character different from -name and makes the match case-insensitive.
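Beyond counting the files, it helps to know how much space those PNGs actually occupy. One way to total it up (my own sketch, combining find with du and awk) is:

```shell
# Total size of all PNGs under the current directory.
# du -k prints each file's size in kilobytes; awk sums the first column.
find . -type f -iname '*.png' -exec du -k {} + \
    | awk '{ total += $1 } END { print total " KB in PNGs" }'
```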

[2] Convert

Create JPEG versions of the PNGs and remove the old PNGs. There is no need to keep them; they are the ones taking up all the space.

Run this on a small subset of your PNGs to make sure you are happy with the resulting JPEG.

$ mogrify -format jpg *.png && rm *.png
$ mogrify -format jpg *.PNG && rm *.PNG

or convert and keep the original PNGs:

$ mogrify -format jpg *.png
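If you'd rather play it safer than the one-shot rm above, a per-file loop (my own sketch; it still assumes ImageMagick's mogrify is installed) deletes each PNG only after its JPEG has been written:

```shell
#!/bin/sh
# Safer variant: convert one PNG at a time and delete the original
# only once its JPEG exists and is non-empty.
convert_pngs() {
    for png in "$1"/*.png; do
        [ -e "$png" ] || continue          # glob matched nothing: skip
        jpg="${png%.png}.jpg"
        if mogrify -format jpg "$png" && [ -s "$jpg" ]; then
            rm "$png"
        else
            echo "kept $png (conversion failed)" >&2
        fi
    done
}

# Example: convert_pngs ~/Desktop
```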

[3] Celebrate

How much space did you recover?

Who really owns your Bose QC35 headphones?

I was excited to finally get my hands on the new Bose QC35 II because noise simply annoys me more than the average bear. The beautiful world we live in today is very noisy, from cars to traffic lights to photocopiers to background chatter, and it's something some of us have learnt to live with while others suffer from the disruption. I'm in the latter crew. Until, that is, a $300 pair of Bose QC35 IIs became my friend, even if only for a short period of time.

During that short period they were amazing. The ANC (Active Noise Cancelling) was superb! I was so excited I started showcasing them (marketing for Bose) to all my software engineer and entrepreneur friends, who like me seek silence to do their focused work, telling them how their lives would change. I also sold my wife on them as a solution for plane travel. Noise inside planes reaches 80dB, the sound of a vacuum cleaner next to your ears, and on a trip from San Francisco to Sydney that has been shown over time to damage eardrums.

That is until I upgraded to firmware 4.5.2.

Enter the Firmware

You see, the Bose QC35 II has a computer inside which uses multiple strategically placed microphones to listen to incoming noise and cancel out the sound waves. This is orchestrated by a small onboard computer (think Arduino, like those from kunkune.co.uk) running custom Bose software that manages the hardware. Hence ANC. Software has bugs, even in production versions. Such is the nature (complexity) of the beast. And manufacturers will send updates over the internet to patch things up.

I have no idea why I installed the firmware update, since the headphones were working flawlessly. I'm sure it was habit; an expectation of better things to come from an update. Just like when I update my iPhone or MBP, I get better performance and maybe a few new bells and whistles (features).

Sound Quality Degradation

After the update, the noise cancelling of my QC35 II was degraded. I sat there in the library hearing the photocopier and background chatter, something I could never hear before. WTF! I tried the two ANC levels (high and low) and they were indistinguishable.

There was something wrong with the v4.5.2 firmware update.

Source: Bose Update RUINS Noise Cancelling??? (TESTED) — https://www.youtube.com/watch?v=yyC9QStmzcA&feature=youtu.be

Whether intentional or not, one has to question whether Bose took the $9-an-hour outsourced-engineering route (Boeing famously did so with the 737 MAX MCAS), because something like this surely could not happen if they owned the whole release process and had proper QA. The timing of these degrading updates does, however, coincide with the release of the more expensive Bose Noise Cancelling Headphones 700. Coincidence or not, I'll leave that to the conspiracy experts to debate.

Next Steps

  1. Downgrade your Bose QuietComfort 35 II from 4.5.2 to 3.1.8. Yes, it's a tad complex and unfortunately Bose doesn't support this, nor do they even explain what each version contains, so do this at your own risk.
  2. Send it back to Bose for replacement/repair; but good luck. Customers who did report that the returned units were just as bad.
  3. Leave your views/complaints on the Bose Community website to hopefully make them acknowledge this and fix it for good. Go here: https://community.bose.com/t5/Around-On-Ear-Headphones/Bose-QC-35-ii-firmware-4-5-2/td-p/213820

So who really owns your Bose QC35 headphones?

Bose.

They are the puppet master here. Controlling at will the quality of the headphones you paid them handsomely for.

Commanding a premium for average-quality sound gear with what used to be amazing ANC, then manipulating the quality of their ANC moat through ghost version updates to prop up newer, cheaper-build products (*cough* Bose 700) by degrading previous-generation units.

If you own the QC35 please let me know how your experience has been so far.

Dockerizing a web app, using Docker Compose for orchestrating multi-container infrastructure (part 1 of 3)

 

This is a GUEST BLOG POST by Andrew Bakonski

Head of International Engineering @ Veryfi.com

Andrew heads up International Engineering efforts at Veryfi supporting Veryfi’s Hybrid Infrastructure on AWS & Azure making sure the lights stay on.

Andrew’s LinkedIn: https://www.linkedin.com/in/andrew-bakonski/

A couple of months ago we decided to move Veryfi's Python-based web app onto Microsoft Azure. The process was complicated and involved several stages. First I had to Dockerize the app, then move it into a Docker Swarm setup, and finally set up a CI/CD pipeline using Jenkins and BitBucket. Most of this was new to me, so the learning curve was steep. I had limited experience with Python and knew of Docker and Jenkins, but had yet to dive into the deep end. After completing the task, I thought I'd share my research and process with the Veryfi community.

I’ve compiled a three-part series that will cover these topics:

  1. Dockerizing a web app, using Docker Compose for orchestrating multi-container infrastructure
  2. Deploying to Docker Swarm on Microsoft Azure
  3. CI/CD using BitBucket, Jenkins, Azure Container Registry

This is the first post in the series.

I won’t go into a full-blown explanation of Docker – there are plenty of articles online that answer that question, and a good place to start is here. One brief (and incomplete) description is that Docker creates something similar to virtual machines, except that Docker containers run on the host machine’s OS rather than on a VM. Each Docker container should ideally contain one service, and an application can comprise multiple containers. With this approach, individual containers (services) can easily be swapped out or scaled, independently of the others. For example, our main web app currently runs on 3 instances of the main Python app container, and they all speak to one single Redis container.
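As a sketch of that multi-container idea, a Docker Compose file for such a layout might look like the following (the service names, image tag, and port mapping here are illustrative assumptions, not Veryfi's actual configuration):

```yaml
version: "3"
services:
  web:
    # Hypothetical image name; in practice this is your app's built image
    image: myorg/python-webapp:latest
    ports:
      - "80:80"
    depends_on:
      - redis
    deploy:
      replicas: 3   # three app instances (honoured when deployed to Swarm)
  redis:
    image: redis:4-alpine   # the single shared Redis container
```

The deploy.replicas setting only takes effect under Docker Swarm, which is the subject of part 2 of this series.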

Dockerizing an app

Note: the example included in this section can be found in this GitHub repo: https://github.com/abakonski/docker-flask
The example here is a minimal, “Hello World” app.

Docker containers are defined by Docker images, which are essentially templates for the environment that a container will run in, as well as the service(s) that will be running within them. A Docker image is defined by a Dockerfile, which outlines what gets installed, how it’s configured etc. This file always first defines the base image that will be used.

Docker images comprise multiple layers. For example, our web app image is based on the “python:3.6” image (https://github.com/docker-library/python/blob/d3c5f47b788adb96e69477dadfb0baca1d97f764/3.6/jessie/Dockerfile). This Python image is based on several layers of images containing various Debian Jessie build dependencies, which are ultimately based on a standard Debian Jessie image. It’s also possible to base a Docker image on “scratch” – an empty image that is the very top-level base image of all other Docker images, which allows for a completely customizable image, from OS to the services and any other software.

In addition to defining the base image, the Dockerfile also defines things like:

  • Environment variables
  • Package/dependency install steps
  • Port configuration
  • Environment set up, including copying application code to the image and any required file system changes
  • A command to start the service that will run for the duration of the Docker container’s life

This is an example Dockerfile:

FROM python:3.6

# Set up environment variables
ENV NGINX_VERSION '1.10.3-1+deb9u1'

# Install dependencies
RUN apt-key adv --keyserver hkp://pgp.mit.edu:80 --recv-keys 573BFD6B3D8FBC641079A6ABABF5BD827BD9BF62 \
    && echo "deb http://httpredir.debian.org/debian/ stretch main contrib non-free" >> /etc/apt/sources.list \
    && echo "deb-src http://httpredir.debian.org/debian/ stretch main contrib non-free" >> /etc/apt/sources.list \
    && apt-get update -y \
    && apt-get install -y -t stretch openssl nginx-extras=${NGINX_VERSION} \
    && apt-get install -y nano supervisor \
    && rm -rf /var/lib/apt/lists/*


# Expose ports
EXPOSE 80

# Forward request and error logs to Docker log collector
RUN ln -sf /dev/stdout /var/log/nginx/access.log \
    && ln -sf /dev/stderr /var/log/nginx/error.log

# Make NGINX run on the foreground
RUN if ! grep --quiet "daemon off;" /etc/nginx/nginx.conf ; then echo "daemon off;" >> /etc/nginx/nginx.conf; fi;

# Remove default configuration from Nginx
RUN rm -f /etc/nginx/conf.d/default.conf \
    && rm -rf /etc/nginx/sites-available/* \
    && rm -rf /etc/nginx/sites-enabled/*

# Copy the modified Nginx conf
COPY /conf/nginx.conf /etc/nginx/conf.d/

# Custom Supervisord config
COPY /conf/supervisord.conf /etc/supervisor/conf.d/supervisord.conf

# COPY requirements.txt and RUN pip install BEFORE adding the rest of your code. Docker's caching mechanism
# will then avoid re-installing all of your dependencies when you change a line or two in your app
COPY /app/requirements.txt /home/docker/code/app/
RUN pip3 install -r /home/docker/code/app/requirements.txt

# Copy app code to image
COPY /app /app
WORKDIR /app

# Copy the base uWSGI ini file to enable default dynamic uwsgi process number
COPY /app/uwsgi.ini /etc/uwsgi/
RUN mkdir -p /var/log/uwsgi


CMD ["/usr/bin/supervisord"]
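The conf/nginx.conf and app/uwsgi.ini files that the Dockerfile copies aren’t shown in this post. As a rough sketch of what the uwsgi.ini might contain (the module name and worker numbers are assumptions, not the repo’s actual values):

```ini
[uwsgi]
# serve the app code copied to /app
chdir = /app
# hypothetical module:callable - whatever WSGI entry point the app exposes
module = wsgi:application
# local socket that NginX proxies requests to
socket = 127.0.0.1:8080
# dynamic process scaling: keep at least 2 workers, grow up to 8 under load
cheaper = 2
workers = 8
```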

Here’s a cheat sheet of the commands used in the above example:

  • FROM – this appears at the top of all Dockerfiles and defines the image that this new Docker image will be based on. This could be a public image (see https://hub.docker.com/) or a local, custom image
  • ENV – this command sets environment variables that are available within the context of the Docker container
  • EXPOSE – this opens ports into the Docker container so traffic can be sent into them. These ports will still need to be listened on from within the container (e.g. NginX could be configured to listen on port 80). Without this EXPOSE command, no traffic from outside the container will be able to get through on those ports
  • RUN – this command will run shell commands inside the container (when the image is being built)
  • COPY – this copies files from the host machine to the container
  • CMD – this is the command that will execute on container launch and will dictate the life of the container. If it’s a service, such as NginX, the container will continue to run for as long as NginX is up. If it’s a quick command (e.g. “echo ‘Hello world'”), then the container will stop running as soon as the command has executed and exited

The Docker image resulting from the above Dockerfile will be based on the Python 3.6 image and contain NginX and a copy of the app code. The Python dependencies are all listed in requirements.txt and are installed as part of the process. NginX, uWSGI and supervisord are all configured as part of this process as well.

This setup breaks the rule of thumb for the “ideal” way of using Docker, in that one container runs more than one service (i.e. NginX and uWSGI). It was a case-specific decision to keep things simple. Of course, there could be a separate container running just NginX and one running uWSGI, but for the time being, I’ve left the two in one container.

These services are both run and managed with the help of supervisord. Here’s the supervisord config file that ensures NginX and uWSGI are both running:

[supervisord]
nodaemon=true

[program:uwsgi]
# Run uWSGI with custom ini file
command=/usr/local/bin/uwsgi --ini /etc/uwsgi/uwsgi.ini
stdout_logfile=/dev/stdout
stdout_logfile_maxbytes=0
stderr_logfile=/dev/stderr
stderr_logfile_maxbytes=0

[program:nginx]
# NginX will use a custom conf file (ref: Dockerfile)
command=/usr/sbin/nginx
stdout_logfile=/dev/stdout
stdout_logfile_maxbytes=0
stderr_logfile=/dev/stderr
stderr_logfile_maxbytes=0

Launching a Docker container

I’m not including the instructions on installing Docker in this post (a good place to get started is here).

With the above project set up and Docker installed, the next step is to actually launch a Docker container based on the above image definition.

First, the Docker image must be built. In this example, I’ll tag (name) the image as “myapp”. In whatever terminal/shell is available on the machine you’re using (I’m running the Mac terminal), run the following command:

$ docker build -t myapp .

Next, run a container based on the above image using one of the following commands:

# run Docker container in interactive terminal mode - this will print logs to the terminal stdout, hitting command+C (or Ctrl+C etc) will kill the container
$ docker run -ti -p 80:80 myapp

# run Docker container quietly in detached/background mode - the container will need to be killed with the "docker kill" command (see next code block below)
$ docker run -d -p 80:80 myapp

The above commands will direct traffic to port 80 on the host machine to the Docker container’s port 80. The Python app should now be accessible on port 80 on localhost (i.e. open http://localhost/ in a browser on the host machine).

Here are some helpful commands to see what’s going on with the Docker container and perform any required troubleshooting:

# list running Docker containers
$ docker ps


# show logs for a specific container
$ docker logs [container ID]


# connect to a Docker container's bash terminal
$ docker exec -it [container ID] bash


# stop a running container
$ docker kill [container ID]


# remove a container
$ docker rm [container ID]


# get a list of available Docker commands
$ docker --help

Docker Compose

Note: the example included in this section is contained in this GitHub repo: https://github.com/abakonski/docker-compose-flask
As above, the example here is minimal.

The above project is a good start, but it’s a very limited example of what Docker can do. The next step in setting up a microservice infrastructure is through the use of Docker Compose. Typically, most apps will comprise multiple services that interact with each other. Docker Compose is a pretty simple way of orchestrating exactly that. The concept is that you describe the environment in a YAML file (usually named docker-compose.yml) and launch the entire environment with just one or two commands.

This YAML file describes things like:

  • The containers that need to run (i.e. the various services)
  • The various storage mounts and the containers that have access to them – this makes it possible for various services to have shared access to files and folders
  • The various network connections over which containers can communicate with each other
  • Other configuration parameters that will allow containers to work together
Here is an example docker-compose.yml describing such an environment:

version: '3'

services:
  redis:
    image: "redis:alpine"
    ports:
      - "6379:6379"
    networks:
      - mynet

  web:
    build: .
    image: myapp:latest
    ports:
      - "80:80"
    networks:
      - mynet

networks:
  mynet:

The above YAML file defines two Docker images that our containers will be based on, and one network that both containers will be connected to so that they can “talk” to each other.

In this example, the first container will be created based on the public “redis:alpine” image. This is a generic image that runs a Redis server. The “ports” setting is used to open a port on the container and map it to a host port. The syntax for ports is “HOST:CONTAINER”. In this example we forward the host port 6379 to the same port in the container. Lastly, we tell Docker Compose to put the Redis container on the “mynet” network, which is defined at the bottom of the file.

The second container defined will be based on a custom local image, namely the one that’s outlined in the first section of this article. The “build” setting here simply tells Docker Compose to build the Dockerfile that is sitting in the same directory as the YAML file (./Dockerfile) and tag that image with the value of “image” – in this case “myapp:latest”. The “web” container is also going to run on the “mynet” network, so it will be able to communicate with the Redis container and the Redis service running within it.
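Because both containers sit on “mynet”, code inside the “web” container can reach Redis by its service name – Docker Compose registers each service in the network’s internal DNS. A tiny hypothetical helper illustrating that (the host/port defaults mirror the YAML above; the function itself is not from the repo):

```python
# Hypothetical helper - not from the repo. The defaults mirror the
# docker-compose.yml above: service name "redis", port 6379.
def redis_url(host="redis", port=6379, db=0):
    """Build a connection URL for the Redis service.

    Inside the "web" container, the hostname "redis" resolves to the
    Redis container over the "mynet" network - no hard-coded IPs.
    """
    return f"redis://{host}:{port}/{db}"
```

A client such as redis-py could then connect with `redis.Redis.from_url(redis_url())`.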

Finally, there is a definition for the “mynet” network at the bottom of the YAML file. This is set up with the default configuration.

This is a very basic setup, just to get a basic example up and running. There is a ton of info on Docker Compose YAML files here.

Once the docker-compose.yml file is ready, build it (in this case only the “web” project will actually be built, as the “redis” image will just be pulled from the public Docker hub repo). Then bring up the containers and network:

# build all respective images
$ docker-compose build

# create containers, network, etc
$ docker-compose up

# as above, but in detached mode
$ docker-compose up -d

Refer to the Docker commands earlier in this article for managing the containers created by Docker Compose. When in doubt, use the "--help" argument, as in:

# general Docker command listing and help
$ docker --help

# Docker network help
$ docker network --help

# Help with specific Docker commands
$ docker <command> --help

# Docker Compose help
$ docker-compose --help

So there you have it – a “Hello World” example of Docker and Docker Compose.

Just remember that this is a starting point. Anyone diving into Docker for the first time will find themselves sifting through the official Docker docs and StackOverflow forums etc, but hopefully this post is a useful intro. Stay tuned for my follow-up posts that will cover deploying containers into Docker Swarm on Azure and then setting up a full pipeline into Docker Swarm using Jenkins and BitBucket.

If you have any feedback, questions or insights, feel free to reach out in the comments.

~ Andrew @ Veryfi.com

About Veryfi

Veryfi is a Y Combinator company (W17 cohort), located in San Mateo, CA. It was founded by an Australian, Ernest Semerda, and the 1st Belarusian to go through Y Combinator, Dmitry Birulia.

Veryfi provides mobile-first, HIPAA-compliant bookkeeping software that empowers business owners by automating the tedious parts of accounting through AI and machine learning.

To learn more please visit https://www.veryfi.com

Linux Server Security Checklist

I had this sitting around in my Google Docs for some time. It's a good idea to share these Linux security tips to help others secure their boxes. So here it is, peeps.

Linux security – paranoid check-list

  1. For direct access to your box, only use ssh. SSH is the most secure standard for both authentication (both host and user) and data protection (everything strongly encrypted, end-to-end).
  2. Enable key-pairs as the only way to access your box. Don’t allow passworded logins. Most passwords are too short and sit (even if in hashed form) on many databases: your bank, your favorite retailer etc. My guide on SSH setup walks you through this; the key setting in sshd_config is:
    PasswordAuthentication no
  3. Run ssh on a high port. The reason is that a lot of security scanners will only scan the standard known-service ports or the lower range (1-1024 are privileged ports that only the superuser can bind/listen to, so they are more attractive to hackers). So running on 43256 (there are 2^16 =~ 65k ports) is much safer.
  4. In the firewall rules, limit access to your (and your customers) IP blocks, i.e. instead of 0.0.0.0/0 (all the internet) allow only from (say) 12.167.110.0/24 (specific block)
  5. Control the users who are allowed entry to your server.
    sudo nano /etc/ssh/sshd_config
    AllowUsers username1 username2
  6. Never ever permit root logins:
    sudo nano /etc/ssh/sshd_config
    PermitRootLogin no
  7. All administrative stuff is done as a known user (accountability) who uses ‘sudo’ after authenticating via SSH.
  8. Use a second-layer firewall (software firewall) in case the first goes down. On Linux you can use iptables with Gufw, one of the easiest firewalls in the world, to manage the iptables rules.
    sudo apt-get install gufw
  9. Run logcheck, a periodic system log scanner that will email you about any unusual event. logcheck comes with a very large rule-set of what can be safely ignored, so it only emails when something really new and different shows up in the logs.
    sudo apt-get install logcheck
    sudo nano /etc/logcheck/logcheck.conf
    # Add your email to SENDMAILTO
    sudo -u logcheck logcheck # run a test
  10. Run tripwire, a service that scans all the executables on the system, and alerts when a signature has changed (i.e. the file has been replaced). There is also a good post here on Setting up Tripwire in Ubuntu 11.10 – Intrusion Detection System.
    sudo apt-get install tripwire
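Several of the SSH-related items above (2, 3, 5 and 6) boil down to a handful of sshd_config lines. A consolidated sketch, with a placeholder port and usernames:

```
# /etc/ssh/sshd_config (excerpt)

# Item 3: run SSH on a high port (placeholder value)
Port 43256

# Item 2: key pairs only, no password logins
PasswordAuthentication no

# Item 5: whitelist the users allowed in (placeholder names)
AllowUsers username1 username2

# Item 6: never permit root logins
PermitRootLogin no
```

Reload sshd after editing for the changes to take effect.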

And that’s a wrap! Are there any others you would recommend?

~ Ernest

How to: SSH secure key authentication on Ubuntu

OpenSSH is the most widely used SSH server on Linux. Using SSH, one can connect to a remote host and gain shell access on it in a secure manner, as all traffic is encrypted.

A neat feature of OpenSSH is authenticating a user with a public/private key pair to log into the remote host. By doing so, you won’t be prompted for the remote user’s password when gaining access to a protected server. Of course, you have to hold the key for this to work. By using key-based authentication and disabling the standard user/password authentication, we reduce the risk of someone gaining access to our machine(s).


So if you are not using SSH with public/private key pair, here is how to get this rolling. If you are using AWS (Amazon Web Services) you would have been forced to use this method. This is great! The instructions below will teach you a bit about this and provide insight into setting this up on your dev VM or a server which doesn’t have this level of security turned on.

Useful commands to note

Accessing server using key

ssh -i ./Security/PRIVATEKEY USERNAME@SERVER -p PORT

Example:

ssh -i ./Security/aws/myname_rsa root@127.0.0.1 -p 22345

Restart SSH server

sudo /etc/init.d/ssh restart

Install & Setup SSH Security Access

Note: This section is for admins only.

1. Install SSH, if not already installed.

On your server (remote host):
sudo apt-get install openssh-server

Locally on your box:
sudo apt-get install openssh-client

Make sure you change your server (and firewall, if present) to listen on port 22345 (or a similar port of your liking in the high range) instead of the standard, insecure 22.

Via Shell

sudo nano /etc/ssh/sshd_config
sudo /etc/init.d/ssh restart

OR

In Webmin > SSH Server > Networking > Listen on port = 22345

How to install Webmin instructions are here: http://www.theroadtosiliconvalley.com/technology/building-ubuntu-lamp-web-server-vm/

2. Create a public/private key pair, locally on your box.
ssh-keygen -t rsa

This will generate the keys using an RSA authentication identity of the user. Why RSA instead of DSA? RSA supports 2048-bit keys, whereas DSA is restricted to 1024-bit keys. Read here: http://security.stackexchange.com/questions/5096/rsa-vs-dsa-for-ssh-authentication-keys

By default the public key is saved in the file ~/.ssh/id_rsa.pub,
while the private key is ~/.ssh/id_rsa.

3. Copy the generated myname_rsa.pub file to the remote host. Use SFTP and from:
/Users/name/.ssh/myname_rsa.pub drop it into the remote host path:
/root/.ssh/myname_rsa.pub

Note: If that folder doesn’t exist then create it:
sudo mkdir /root/.ssh/
4. SSH into remote host and append it to ~/.ssh/authorized_keys by entering:
cat /root/.ssh/myname_rsa.pub >> ~/.ssh/authorized_keys
rm /root/.ssh/myname_rsa.pub
4.1. Check the permissions on the authorized_keys file. Only the authenticated user should have read and write permissions. If the permissions are not correct, change them by:
chmod 600 ~/.ssh/authorized_keys
5. Enable SSH public/private key pair access.
sudo nano /etc/ssh/sshd_config

Make sure you have the following:

RSAAuthentication yes
PubkeyAuthentication yes

Save when exiting.

6. Reload new configuration.
/etc/init.d/ssh reload (or)
service ssh reload
7. Protect your private key file. Locally on your machine, assuming you moved the private key file to the folder ./Security/
chmod 0600 ./Security/myname_rsa
8. Test your new setup. Login to your remote host from your machine:
ssh -i ./Security/KEYFILE USERNAME@SERVER -p PORTNO

where ./Security/KEYFILE is the location of your private key file, e.g.:

ssh -i ./Security/myname_rsa root@1.1.1.1 -p 22345

You should be granted access immediately without password requirements.

9. Disable authentication by password.
sudo nano /etc/ssh/sshd_config

Make sure you have the following:

ChallengeResponseAuthentication no 
PasswordAuthentication no
UsePAM no

Save when exiting.

10. Reload new configuration.
/etc/init.d/ssh reload (or)
service ssh reload
11. Test #2 of your new setup. Login to your remote host from your machine:
ssh -i ./Security/KEYFILE USERNAME@SERVER -p PORTNO

where ./Security/KEYFILE is the location of your private key file, e.g.:

ssh -i ./Security/myname_rsa root@1.1.1.1 -p 22345

You should be granted access immediately without password requirements. Also test using the old method, which should prohibit access:

ssh root@1.1.1.1 -p 22345

Should yield: Permission denied (publickey).
Server is now protected against brute-force attacks.

Finally, make sure you adjust your development tools so they too can gain access to your secured server.

Tools

Your choice of tools may vary but the process is very similar. The following are my most-used tools and how to tweak them to allow SSH key entry to my secured server.

FileZilla – SFTP

To enable FileZilla to access the server under the new configuration do this:

  1. FileZilla > Preferences…
  2. Settings window opens. Select “Connection > SFTP” (left hand navigation).
  3. In the right pane, click on “Add keyfile…”. Navigate to your private keyfile and click on it to add.
  4. You may be asked by FileZilla to “Convert keyfile” to a supported FileZilla format. This is fine; just click “Yes”. Save the output file to the same location as your private key file.
  5. Click OK on the Settings window to save final changes.

SublimeText2 – IDE

To enable SublimeText2 to access the server under the new configuration do this.

In your solution's sftp-settings.json configuration file, enable key file access like this:

"ssh_key_file": "~/.ssh/id_rsa",

Example:

"ssh_key_file": "~/Security/myname_rsa",

And that’s it. Happy development!

~ Ernest

Outsourcing software development: pros and cons

Outsourcing part of software engineering is not for everyone. Outsourcing requires a lot of micromanagement and software engineering background to make sure that what you ask for is what you get.

What follows is my own experience over the last 10 years across many outsourcing contracts in India, China and Eastern Europe, with outsourcers both independent and agencies.

Are you sure it’s for you?

Never “palm off” the job in the form of outsourcing. Otherwise you will be heading down a spiral. The important piece of outsourcing is both micromanaging and understanding what the fuck is getting delivered. This way you can either pull the plug on crappy code or influence the right sort of implementation.

If you outsource too early, or outsource the core IP, you lose the power to radically change the design of your product. Early design is constantly changing, especially if you are building something which has never been done before. You want the flexibility to change fast. You need to be in control and know what is going on with all the moving pieces. Read more on how bad outsourcing impacted Boeing’s Dreamliners (787s).

This leads me to some key points on what skills you should have if you are going to outsource. Mind you I said “you” because it cannot be someone else you palm it off to.

1. Have a strong background in software engineering.

Loose coupling, less code, Don’t Repeat Yourself (DRY), explicit is better than implicit, test-driven development (TDD), Distributed Version Control Systems (DVCS): all of this is important. Did you understand any of those? If not, then you are going to get a piece of crap code. Why is code important? Because it determines the type of engineering culture you build out internally, future maintenance (this is where the hard costs nail you down) and local hiring; quite frankly, great engineers do not like working in a pile of mess.

If you do not know how to code move on or go and learn to code. Anyone with the right attitude and time today can learn to code. See http://www.codecademy.com/, http://www.udacity.com/, https://developers.google.com/university/, etc… plenty of resources online for free. No excuses.

If the outsourcer delivers crap code, you tell them to fix it. If they continue to deliver crap code, you break the contract and provide constructive feedback to them.

Detail, detail, detail. “The devil is in the detail.” My previous biz partner stressed this to the point where it is now embedded into my psyche and into how I work.

If you are outsourcing, make sure that you or the person working 1:1 with the outsourcer are very detail-oriented. This way errors are caught fast and stopped at the front line, and, where appropriate, you can move fast and fire the outsourcer.

2. People skills

If you have a background working with people (we all do right) and managing those people (oh here we go) then this part will also get smoother. You need to understand you are working with people who have their own lives, family, goals and ambitions etc… so don’t be an ass because you outsourced a piece of work to a “cheaper” labor country.

If it helps, review (even if you have already read it) How to Win Friends and Influence People by Dale Carnegie. The 3 basic principles:

  • Don’t criticize, condemn, or complain.
  • Give honest and sincere appreciation.
  • Arouse in the other person an eager want.

Look, you are going to have to micromanage them. Yes micromanagement ain’t ideal for your immediate employees but for contractors it is a must. They are paid to do a certain job and usually move on. You need to receive quality (refer to point 1 on engineering) and also make sure commitments are completed on time and within budget. Hence the micromanagement.

I also like to emphasize to build a good relationship so you can work with them again. Obviously pending the results of your encounter. Results is all that matter at the end of the day. But, never lose sight of maintaining that level of expected quality. If it drops, give them a chance to correct it by providing constructive feedback. If nothing changes again, then cut the tie immediately.

Remember: “Fool me once, shame on you; fool me twice, shame on me.”

Right so you have the necessary skills to get moving. Here is where the harder stuff begins.

The checklist!

1. Automate.

As much as you can. Outsourcing isn’t just relationship management. There are a number of balls in the air from managing the relationship to code review & feedback to product questions that need to be answered and/or fleshed out.

Use DVCS (ref my previous blog post) with email alerts enabled for code checkins, comments and issue tracking. Have everyone involved with the job on email alerts so you know when code is checked in or issues logged. I like using Bitbucket for all of this.

I also recommend you put them on HipChat for Private group chat and IM, business and team collaboration. This way you will maintain all communication in the one place.

2. The standards list.

Send the contractor your “standards list” of what you expect out of the engagement. Use Google Docs to write one up and share it if you do not have one now. Put a line in the sand. Set the bar on:

  • Expected quality – DRY baby!,
  • Naming conventions,
  • Daily status updates – email or via HipChat,
  • Use of standard industry engineering practices like TDD, else you will get code without unit tests!!
  • How everyone can reach each other for questions on product spec or similar, i.e. Skype, emails, cell #, HipChat etc. Include the timezones everyone is working in.

3. Requirements.

Fuck sake man. More detail. Stipulate any API calls, use cases, designs, standards as mentioned above, etc. If you have an engineering background you will appreciate and say “fuck yeah” to what I just said.

No one likes to document things but this small initial investment will weigh in its worth when the final product is delivered to spec. Do not leave anything for misinterpretation.

  • Have a Balsamiq design illustrating all the screens you expect and how they should look.
  • Where applicable provide designs for every screen. Do not let the contractor try to work out for themselves what you want. Never ends well and you get billed for that time.
  • Technical detail around API calls (request & response) with examples, use cases, a high-level flow diagram, etc.

4. Understand it before you open your mouth.

If you are developing for a channel you have no experience in, e.g. Android, then spend time learning it to at least a “high level” understanding so you can speak the lingo and know when you are getting lied to in the face. If you level up with the lingo then you will get respected more and the contractor will not be able to pull a “shifty” on you.

5. Hiring.

Never straightforward, and always requires a ton of work. But this pays off when you have the right contractor on board working with you.

  • Spend time writing up a detailed job spec and list it on oDesk/eLance and wait for the flood of offers. Immediately decline those that have not met all the 5-star criteria.
  • Set up a spreadsheet of all those that applied to keep track of who you shortlist, their contact details, your last communication with them etc. From the 100, narrow it down to the top 20.
  • Interview the top 20 via Skype video (yes, you need to see them) and listen for something that will differentiate one from the rest. For me it was getting asked questions I did not have an immediate answer to. Smart, switched-on engineers are like that, and you know you've got a winner there.

Remember that at every point in the interview/communication you need to be prepared with a series of questions so you can use those as a base for quality and comparison.

Tip: when you do engage the outsourcer, make sure you stay working via oDesk or a similar tool. As much as you may be conned into believing working outside oDesk is worth a 10% discount, it isn't. oDesk provides great tools to track your contractors' time (with videos) and in the end you get to provide feedback on them. Bad business means bad comments means no future business. So it is in everyone's favor to be on best terms and get the job done right.

6. Have fun!

Not a long-term strategy

Outsourcing is great when you first kick off a startup and need to fill in skill or time-constraint gaps, like kicking off a new channel which will interface with your in-house platform (your IP, which you built and are evolving) or design work. But that is where it stops.

Remember that outsourcing is work for hire. Your own company / startup is a labor of love which only you and those that live and breathe it each day share in the office. So if you have high expectations of the outsourcer to care and be on the ball with something they are building or have built then you most likely skipped the crucial part. The part where I told you to own the whole process and be laser focused on the work getting outsourced. You fucked up. You’re at fault not them.

Never outsource your core business. Only channels. Those that are not what I call IP (intellectual property). Your IP always stays in-house, managed by you and your cofounder, and ultimately a kickass in-house team. For example, a business that’s attractive to investors typically has some sort of IP that’s hard to clone by competitors. That thing that makes it unique. It could be a unique algorithm or even data. You’d never outsource that. Stuff that can be outsourced might be a channel, e.g. a mobile app, as long as the IP (say that algorithm) is in the API your local team manages.

Final note

You are not looking for a “sweat shop”. Find rock stars that have a history of delivering quality code on time while communicating effectively. Communication decides whether you get an apple or an orange when all you wanted was an apple.

If you have any stories (good or bad), please share them with me below in the comments.

Happy outsourcing!
~ Ernest

PHP Coding Horrors and Excuses for Poor Decisions

Having coded in PHP for 7 years, I feel I can give balanced feedback on PHP. Today I mainly focus on Python & .NET because these languages have stood the test of time and allow me to attract great talent. I find it amusing that engineering leaders in established companies still make the backward decision to use PHP to power their business/core sites. Not to mention software engineering newbies falling prey to using it as their 1st language to experience software development and put theory into practice. So let’s explore this in more detail.

A quick story

A few years back, while attending a Python class, a young chap put up his hand, introduced himself as a long-time PHP developer and asked the lecturer a question: “What is the difference between Python’s dictionaries & lists and PHP’s arrays?” Bang. This is exactly why I do not want newbies to go down that route. Data structures are fundamental to any software design. PHP will NOT force you to think about data structures when coding.. instead it just sticks a boot in your face and says walk.
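For anyone in the same position as that student, the distinction is easy to demonstrate in a few lines of plain Python:

```python
# A list is an ordered sequence, indexed by position:
langs = ["python", "php", "ruby"]
assert langs[0] == "python"

# A dictionary is a mapping, indexed by key:
ages = {"python": 34, "php": 30}
assert ages["php"] == 30

# PHP's single "array" type plays both roles at once, which is exactly
# the conflation behind the student's question.
```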

As a leader

As a smart, fast-paced technology leader, you should NOT be suggesting or advising PHP as the company’s “language of choice”. If a company is using WordPress, it’s typically for its blog (yes, WordPress rocks), due to legacy reasons (we all learn, right) or a variant thereof. PHP is not even a great presentation language (what it was so famous for years ago), lacking good support for a real templating engine. Going LAMP stack, as in Linux stack, is not about moving to PHP. Matter of fact, LAMP is an old, beaten, used & abused lingo which means little today given the range of open source stacks that run on the Linux OS.

Let’s first look at what makes a good language. And if you are a leader looking at starting or moving to a new language this post should be enough to tell you what to avoid. Learn from other’s mistakes so you don’t have to make them yourself.

What makes a good language

  • Predictable
  • Consistent
  • Concise
  • Reliable
  • Debuggable

Check out the philosophies behind Python in the Zen of Python for what a good language encourages.

PHP fails miserably here.

  • PHP is full of surprises: mysql_real_escape_string, E_ALL
  • PHP is inconsistent: strpos, str_rot13
  • PHP requires boilerplate: error-checking around C API calls, ===
  • PHP is flaky: ==, foreach ($foo as &$bar)
  • PHP is opaque: no stack traces by default or for fatals, complex error reporting.
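Two of those bullets deserve a concrete illustration. Here is a quick sketch of the strpos boilerplate and the foreach-reference trap (behaviour is the same on any PHP 5+):

```php
<?php
// strpos() returns the match offset, or false when there is no match.
// A match at offset 0 is falsy, so loose comparison draws the wrong conclusion:
$pos = strpos('abcdef', 'a');
var_dump($pos == false);    // bool(true)  -- "not found"? It IS found, at offset 0.
var_dump($pos === false);   // bool(false) -- the === boilerplate you must never forget

// The foreach reference trap: after the first loop, $item still aliases
// the last element, so the second loop overwrites it while iterating.
$arr = [1, 2, 3];
foreach ($arr as &$item) { /* do nothing */ }
foreach ($arr as $item) { /* do nothing */ }
print_r($arr);              // [1, 2, 2] -- the 3 is gone
```

Neither of these is a contrived corner case; both have shipped real bugs, and the fix for the second (an `unset($item)` after the loop) is pure ritual.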

PHP is NOT an enterprise language

An enterprise language is one that has good corporate support. The best example is Microsoft and its .NET platform.

Look at the support behind the PHP language. No corporation supports PHP’s growth and maturity the way Sun and Google do for Java, Google (Guido van Rossum’s employer) does for Python (inc. the Django framework), or 37signals does for Ruby (inc. RoR), etc.

PHP is not supported by Yahoo. They failed to launch a version with Unicode support in the hyped-up PHP 6, and the father of PHP, Rasmus Lerdorf, is no longer based at Yahoo. Nor is PHP supported by Facebook: Facebook has been trying hard to move away from its aged roots and now compiles PHP into C++ via HipHop (more on that below).

The mess that is PHP

There are plenty of websites covering the mess that is PHP. Just go and read them if you are still doubtful.

Some of those nasty PHP horrors

  • Unsatisfactory and inconsistent documentation at php.net.
  • PHP is exceptionally slow unless you install a bytecode cache such as APC or eAccelerator, or use FastCGI; otherwise it recompiles the script on every request. It’s the reason Facebook invented HipHop (a PHP-to-C++ compiler) to increase speed by around 80%, and later a just-in-time (JIT) compilation engine.
  • Unicode: Support for international characters (mbstring and iconv modules) is a hackish add-on and may or may not be installed. An afterthought.
  • Arrays and hashes treated as the same type. Ref my short story above.
  • No closures or first-class functions until PHP 5.3. No functional constructs such as collect, find, each, grep or inject. No macros (but complaining about that is like the starving demanding caviar). Iterators are present but inconsistently used. No decorators, generators or list comprehensions.
  • The fact that == doesn’t always work as you’d expect, so they invented a triple-equals === operator that tests for true equality.
  • include() can generate circular references and yield many unwanted and hard to debug problems. Not to mention its abuse to execute code that gets included.
  • Designed to be run in the context of Apache. Any back-end scripts have to be written in a different language, and long-running background processes in PHP have to override settings in the global php.ini.
  • PHP lacks standards and conventions.
  • There’s no standard for processing background tasks, such as Python’s Celery.
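For fairness, the closures point has a partial fix: since 5.3, PHP does have anonymous functions, though with its own twist. A small sketch; note the explicit use clause, because unlike Python, a PHP closure captures nothing from the enclosing scope unless told to:

```php
<?php
// array_filter + a closure is the closest PHP gets to grep/find/collect.
$threshold = 2;
$big = array_filter([1, 2, 3, 4], function ($n) use ($threshold) {
    return $n > $threshold;
});

// array_filter preserves the original keys (another small surprise),
// so re-index before treating the result as a list.
print_r(array_values($big));   // [3, 4]
```

The mandatory `use` clause and the key-preserving filter are exactly the kind of per-function trivia that makes the language feel bolted together rather than designed.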

PHP presents four challenges for Facebook:

  • High CPU utilization.
  • High memory usage.
  • Difficult to use PHP logic in other systems.
  • Extensions are hard to write for most PHP developers.

Don’t use Facebook as an excuse to make PHP your core language.

Excuses for the poor decision to use PHP

“But Facebook is all PHP.”

Boo hoo. Is that what your decision was based on? Seriously? It is well documented that Facebook uses PHP for legacy reasons. It is what Mark Zuckerberg used in his dorm nearly a decade ago, and somehow it stuck around. Later a top FB engineer named Haiping Zhao released HipHop, literally rewriting the entire PHP runtime and thus avoiding the worst attributes of the language. Since 2007 alone, Haiping has named four failed attempts to move away: to Python (twice), to Java and to C++. The reason these did not work is incumbent inertia (it’s what’s there).

So you see, it is not the same PHP you are coding in but a far superior subset of it, customized for Facebook’s processes and development efforts. PHP at Facebook was a mistake that has been corrected to some degree. Today the preferred strategy at Facebook is to write new components in a decoupled manner using a better language of choice (C++, Python, Erlang, Java, etc.); this is easily facilitated by Facebook’s early development of Thrift, an efficient multi-language RPC framework.

“But Yahoo is all PHP.”

Seriously? Shall we even go into this? A sinking Titanic that started its life as a manually maintained directory site. Today’s online apps are more advanced and demand high concurrency and a dynamic nature, something more advanced languages are capable of delivering.

“But Zynga (a large gaming company) uses PHP.”

At the time Zynga started developing for the platform, there was no official Facebook SDK available other than the PHP one. Naturally, Zynga started its life on Facebook with PHP. The rest is history.

Looking for a better language? Guess! ~ Yes I drew that by hand 🙂 Hope you like it!

Technology breeds culture

Bring a bunch of core PHP developers (those who only know this language) on board and you get what you pay for: someone who can hack a script together without really understanding the fundamentals of software design and engineering.

Think about this: your most valued assets are your staff (the people in your company), and the staff you attract will naturally come from companies, backgrounds and experiences that align with the technology decisions you have made.

How about rewriting your code base in another language?

There is also a lot of industry precedent (the Netscape case, or Startup Suicide) indicating that rewriting an entire codebase in another language is usually one of the worst things you can do. Either don’t make the mistake of going down the PHP route in today’s era, or start introducing a new language into the stack for new projects. Having a hybrid setup is OK; it actually allows you to iterate fast, gives your engineering crew something new to play with, and should you ever need to switch stacks, you are already halfway there. Don’t make the same mistakes Facebook did.

The only bit I like about PHP is its “save file, refresh page and there are your changes” cycle. The language is “easy to use”, yes. It’s hard to figure out what the fuck it’s doing, though.

Happy coding!

~ Ernest