Avatar
🏃
achat android animaux anticor apple arnaque association biot bluesky bricolage canoe canyoning ccesoir chateau choisir-velo cinema citation citroen-c8 commission configuration consomacteur course courses courses-dobstacles cuisine cv-pro dawarich debian dell depannage dev docker don dsi désinformation economie facebook fedora firewall football garmin gohugo gravel hebergement home-assistant hugo humhub inondation internets iphone-ipad jeux justice kali linux livre loisirs mac marathon mastodon mecanique misere momes-dazur musee-loisirs nice-matin non-classe oracle padel petition peugeot-206 peugeot-307 peugeot-406 photos php politique postgresql pro proxmox randonnee randonnée raspberry redhat redis republique-numerique reseaux-sociaux sante-internets sante-loisirs securite-internets self-hosted ski ski de randonnée sortir soutien spam sport-biot-2 sports stockage theatre trail twitter/x twitter/x ubuntu velo velo-biot velo-sur-biot via-ferrata ville villes visites voyage vtt windows wordpress wordpress-fr xbox élections municipales 0189xxxxxx 10-km 10-miles 10km 12-km 140 2014 2kv 307 406 406-hdi 920 920xt 930 935 abus acpi actif azur adb adhesion adsl agay age air aix-en-provence alcatel allianz allianz-riviera alpe-dhuez alpes-maritimes alpilles alternateur amap amazon amp anaconda analyse analytics android animaux anote anthea anthea-fr antibes anticor antutu apache2 apn apple applepi-baker april arcep arfi arnaque aroooo aseb-am asics association assurantprotection.fr asterisk asus asus-zenfone atos atsemtex audit auron authentique-fr autoroute avis awesome-note awk awox baignade balade balade-fr barcelonnette barnier base-de-donnee bash basket-ball belt benchmark bento bentomedia betrail biellette bigreen bilan bilan professionnel bilan sportif bilan-fr bio biot biot-fr biot-vernir bitdefender bkl-l09 blacklist blender blockchain-fr blog blogger-com bluesky bonifacio bormes-les-mimosas bot brague braun breil-sur-roya brew bricolage brm brm200 bruxelles bueil bug c cadolive cagnes-sur-mer caille caisse-depargne calanques calencon calendrier camargue cannes cannes-fr canoe cantabrie canyon canyoning cap-dantibes cardio carlit carnaval-de-nice carnaval-de-nice-fr carros carroz casa castellane castellar cat catamaran caussols ccbot ccc cdisplay ceinture cevennes chamonix chateau chaussures cheiron choisir velo cinema circuit cisco citation citrix classement claudebot clignotant cloud cloud-act cloud-personnel cnews codegouv col connecté col de greoliere col-connecte col-de-la-madone col-de-turini colmars competition compression configuration connect consomacteur conspiracy watch convergence-azureenne coreos coronarivus coronavirus corse courmes course course-a-pied course-dobstacle course-dobstacles course-dorientation course-fr coursegoules courses-2 covid covid-19 cozy-cloud cozytouch cpu critique cross cuda cuenod curl cursor dansup dark-web data dataforseo dawarich dawarich.app day-journal debian decathlon decodex deletefacebook dell dello-sciliar-catinaccio delphi demi-yasso derby-de-la-meije developpement developper-tools-access diag diario diaro diaspora digikam digiposte distribution docker docteur-gsm-com dolceacqua dolomites domolites domotique don dourgne dsi duranus débats désinformation ecologie economie education el-capitan elasticsearch elk elm327 email endurain enphase ensol entreprises entretien epidemie escape-game escarene escroquerie espagne estrosi europe eurovelo evasion-fiscale evolution-a-faire exiftool export eze f-f-a facebook facture facture-deau-fr fakenews falicon fan fayence fedora fenix 7 ffa ffmpeg filebeat fillion fillon film-bon film-moyen filtre-a-gazole filtre-pollen fire firewall fittrackee folder foodwatch footing for forerunner forerunner-935 forerunner-945 forerunner-955 forgejo framasoft framasphere france culture france-3 france-soir fraude fraude-fiscale free free-mobile freebox frejus funchal gafams gaillac garageband garmin garmin-connect gavarnie gcc gelas geodes gilette git github glance gmail goaccess gohugo google google-analytic google-analytics google-search-console gopro gorbio gotify goudurix gouffre gourdon gourdon-fr gptbot gpx grafana grasse gravel graxx greoliere greolieres greolieres-les-neiges grub gréolières gtest guillon hadopi haproxy hautpoul hebergement high-trail-vanoise hintertux hipay hipay-com historique-des-dons holdup home-assistant homeassistant hommage honeywell honor hop howto htv huawei hugo huile-direction-assiste humhub ia ibm iconservicesagent ie ign ilonse imagento imageoptim imagneto immich imovie import impots indent injecteur inondation insee insta360 installation internet intimidation ipad ipad-argus iphone iphoto iptables isola-2000 issue italie itra itunes j2s jaime-courir java javascript jeedom jenkins jetpack jeux jeux-de-sophia jeux-de-sophia-antipolis joplin joseph journey jpegoptim juns jupyter justice kali karer-pass kayak kernel kibana kies-app kilometre-vertical kisskissbankbank kiwix klaxon kodi kokopelli korben la-brague la-capelette la-grave-de-peille la-poste la-quadrature-du-net la-vie lac-de-vens langage lantosque lapeyre laquadrature latemar lateral lautrec lcd le-monde leon les-cammazes les-visiteurs-du-soir let's encrypt lets-encrypt levens libreoffice libvirt lightmd linux liste livebox livre logement logiciel logstash loi-numerique lombricomposteur luberon lulu mac mac-os macjournal macos macos-high-sierra madere malade malware mamp mandelieu maps marathon marche mariadb markdown marseille mastodon matomo matomo-analytics matosdon matrix mazamet mcafee mecanique mediapart meduses meije meltdown memo menuiserie mercantour merci-michel mermaid meta metricbeat microsoft-teams migration misere mkdocs module mogrify mojave monsanto montagne-noire montagne-noire-fr montauroux motionpro mougins moulinet mouton msf mud-day mud-day-fr munin mysql mytf1 mytf1-fr mywellness myzone naiad naiad2020 natation natation-libre nature nautipolis neige netamo netatmo nextcloud nginx nice nice-fr nice-matin notes nsinvalidargumentexception ntfs nuxit nvidia objective-see obsolescence-programmee obstacle occ occasion odbii ok-google ollama olvid onedrive oopad open open-data open-sky open-source open-source-experience opencv opendata opensky opensky-a-valbonne opio oracle-linux orange origine-cycles orsiere osm france osmc oss-paris osx osxp osxp2024 osxp2025 outlook owncloud pac paca padel panorama panoramique paradisdiscount-com parc-du-paradou parc-naturel-dello-sciliar-catinaccio paris pascal paul paypal pdf peillon peinture petition peugeot peugeot-307 phare photo photon photos photovoltaique php phpnet phpnet-org pichauris pigeon pinterest pip piscine planning play-store plongee plu plugin pluviometrie pneu politique pollution polylang postfix postgresql ppri prejuges preparation print prix programmation prom-classic prometheus provence proxmox pssh publicite purge pyrenees python python3 qnap quartier qwant raid rameur-dinterieur rancheros randonnee randonnee-2 randonnees randonnée raspberry recette recette-fr redbysfr redhat redis redmi reparartion reparation reseau-sociaux resideo rest-api resultat reunion revue-de-presse rgpd rimplas rock64 root roquebilliere roubion rouret row rsyslog rt-france rubitrack rue89 saint-amancet-fr saint-avit saint-fereol saint-jeannet saint-vallier-de-thiey saintetic salade-nicoise-fr salle-serveur salon samsung san-remo sante sauter scop sd-card securite security.txt sed selection self-hosted semaine-de-la-critique semi semi-marathon semi-marathon-fr sentier seranon server-git serveur service-public siagne sidobre sierra signal sip ski ski de randonnée ski-de-randonnee skred skype slack smart-home smartphone smsc snap sommets sondage sondage-en-ligne sophia-antipolis soreze soreze-fr sortie-en-famille sospel soual spam spartan spartan-race spartan-race-fr spartian spectre sport sql sqlite ssd stable-diffusion stade station-meteo statistique statistiques stockage strava suisse suivi summary surf svn swap swarm syslog systemd tanneron tapform tarif tarn tarn-fr taxes telerama television temp temperature template templier tende tennis tennis-de-table tensorflow test testeur tests textwrangler theatre theoule-sur-mer thorenc thunderbird thunderbirds tignes tor tour-des-sangliers tour-du-sanglier tourisme tourrettes-sur-loup trail trail-de-la-vesubie transposh transvesubienne trashbusters travail travaux trial trifecta turbie tux tuxedo tuxedo-computer tuya twitter tyrol ubaye ubuntu ufc-que-choisir ultra uniq unknown unroot update urbain urban-trail urbanbiker utcam utelle utmb vae valbonne valeo vallee-des-merveilles valloire vallon-des-horts valmasque var vaucluse vaultwarden vauplane vegay velo velotaf vence ventoux veolia verdon vesubie vidange video villeneuve-loubet vim virtual-box virtualbox virus vma vmware vol voyage vpn vps vtc vtt wanderer wannacry webalizer wget whatsapp wifi wikimedia-foundation wikipedia wiko windows wine woocommerce wordfence wordpress x xbmc xbox yahoo-mail yolo zenfone zenpad zigbee
  • Here the script in Python : https://github.com/CYBERNEURONES/Python/blob/master/JoplinCleanRessource.py

    #
    # Version 1 
    # for Python 3
    # 
    #   ARIAS Frederic
    #   Sorry ... It's difficult for me the python :)
    #
    
    from time import gmtime, strftime
    import time
    import json
    import requests
    import os
    import sqlite3
    import re
    
    #conn = sqlite3.connect('my_db.db')
    find_this = "\(:/"
    
    #c = conn.cursor()
    #c.execute('''DROP TABLE LINK''')
    #conn.commit()
    #c.execute('''CREATE TABLE LINK (ID_NOTE text, ID_RESOURCE text, CHECKSUM_MD5 text)''')
    #conn.commit()
    
    #IP
    ip = "127.0.0.1"
    port = "41184"
    token = "Put the token here"
    nb_request = 0
    my_body = ""
    headers = {'Content-type': 'application/json', 'Accept': 'text/plain'}
    url_notes = (
        "http://"+ip+":"+port+"/notes?"
        "token="+token
    )
    nb_total_ressource = 0
    nb_local_ressource = 0
    ALL_ID = {}
    try:
        resp = requests.get(url_notes, headers=headers)
        nb_request += 1
        resp.raise_for_status()
        resp_dict = resp.json()
        #print(resp_dict)
        for my_note in resp_dict:
            nb_local_ressource = 0
            my_body = my_note.get('body')
            my_ressource = [m.start() for m in re.finditer(find_this, my_body)]
            for my_ressource_x in my_ressource:
                 nb_total_ressource += 1
                 nb_local_ressource += 1
                 my_ressource_id = my_body[my_ressource_x+3:my_ressource_x+32+3]
                 print(nb_local_ressource,":",my_note.get('id'),":",my_ressource_id)
                 ALL_ID[my_ressource_id]=my_note.get('id')
                 
                 #c.execute(sql_request)
                 #conn.commit()
    except requests.exceptions.HTTPError as e:
        print("Bad HTTP status code:", e)
    except requests.exceptions.RequestException as e:
        print("Network error:", e)
    
    nb_keep = 0
    nb_remove = 0
    url_resources = (
        "http://"+ip+":"+port+"/resources?"
        "token="+token
    )
    try:
        resp = requests.get(url_resources, headers=headers)
        nb_request += 1
        resp.raise_for_status()
        resp_dict = resp.json()
        #print(resp_dict)
        for my_resource in resp_dict:
            my_id = my_resource.get('id')
            if my_id in ALL_ID:
                print("Keep for notes",ALL_ID[my_id])
                nb_keep += 1
            else:
                print("Remove");
                nb_remove += 1
                url_resources_delete = (
        "http://"+ip+":"+port+"/resources/"+my_id+"?"
        "token="+token
    )
                try:
                     resp2 = requests.delete(url_resources_delete, headers=headers)
                     resp.raise_for_status()
                     nb_request += 1
                except requests.exceptions.HTTPError as e:
                     print("Bad HTTP status code:", e)
                except requests.exceptions.RequestException as e:
                     print("Network error:", e)
    except requests.exceptions.HTTPError as e:
        print("Bad HTTP status code:", e)
    except requests.exceptions.RequestException as e:
        print("Network error:", e)
    
    #conn.close()
    print("nb_request",nb_request,"nb_total_ressource : ",nb_total_ressource," nb_local_ressource : ",nb_local_ressource)
    print("nb_keep",nb_keep,"nb_remove",nb_remove);

    Here the result :

    joplin Created Fri, 01 Mar 2019 00:00:00 +0000
  • J’ai essayé de faire une installation de cmake, et j’ai eu l’erreur :

    $ brew install cmake
    /usr/local/Homebrew/Library/Homebrew/utils/lock.sh: line 27: /usr/local/var/homebrew/locks/update: Permission denied
    -e:1:in `initialize': Bad file descriptor (Errno::EBADF)
    	from -e:1:in `new'
    	from -e:1:in ' Error: Another active Homebrew update process is already in progress. Please wait for it to finish or terminate it to continue. Error: The following directories are not writable by your user: ...

    Pour fixer le problème j’ai fait :

    brew Created Thu, 28 Feb 2019 00:00:00 +0000
  • Quand j’essaye de faire le build de dlib j’ai l’erreur suivante :

    In file included from /private/var/folders/72/mwd843qs5dnfxxzc5zzwx5mw0000gn/T/pip-install-yx4dc86g/dlib/dlib/gui_widgets/fonts.cpp:16:
        /private/var/folders/72/mwd843qs5dnfxxzc5zzwx5mw0000gn/T/pip-install-yx4dc86g/dlib/dlib/gui_widgets/nativefont.h:27:10: fatal error: 'X11/Xlib.h' file not found
        #include <X11/Xlib.h>
                 ^~~~~~~~~~~~
        1 error generated.
        make[2]: *** [dlib_build/CMakeFiles/dlib.dir/gui_widgets/fonts.cpp.o] Error 1
        make[2]: *** Waiting for unfinished jobs....
        make[1]: *** [dlib_build/CMakeFiles/dlib.dir/all] Error 2
        make: *** [all] Error 2

    Pour fixer le problème :

    ln -s /opt/X11/include/X11 /usr/local/include/X11

    J’ai donc pu faire la compilation de dlib qui est utilisé par face_recognition :

    $ pip install face_recognition
    Collecting face_recognition
      Using cached https://files.pythonhosted.org/packages/3f/ed/ad9a28042f373d4633fc8b49109b623597d6f193d3bbbef7780a5ee8eef2/face_recognition-1.2.3-py2.py3-none-any.whl
    Requirement already satisfied: numpy in /usr/local/lib/python3.7/site-packages (from face_recognition) (1.16.1)
    Requirement already satisfied: Pillow in /usr/local/lib/python3.7/site-packages (from face_recognition) (5.4.1)
    Collecting dlib>=19.7 (from face_recognition)
      Using cached https://files.pythonhosted.org/packages/35/8d/e4ddf60452e2fb1ce3164f774e68968b3f110f1cb4cd353235d56875799e/dlib-19.16.0.tar.gz
    Requirement already satisfied: face-recognition-models>=0.3.0 in /usr/local/lib/python3.7/site-packages (from face_recognition) (0.3.0)
    Collecting Click>=6.0 (from face_recognition)
      Using cached https://files.pythonhosted.org/packages/fa/37/45185cb5abbc30d7257104c434fe0b07e5a195a6847506c074527aa599ec/Click-7.0-py2.py3-none-any.whl
    Building wheels for collected packages: dlib
      Building wheel for dlib (setup.py) ... done
      Stored in directory: /Users/.../Library/Caches/pip/wheels/ce/f9/bc/1c51cd0b40a2b5dfd46ab79a73832b41e7c3aa918a508154f0
    Successfully built dlib
    Installing collected packages: dlib, Click, face-recognition
    Successfully installed Click-7.0 dlib-19.16.0 face-recognition-1.2.3

    A suivre.

    Created Thu, 28 Feb 2019 00:00:00 +0000
  • Quand on fait un Takeout sur Google on a des fichiers Takeout-XX qui font 2 Go maximum, ensuite il faut recomposer le fichier Takeout de base. Pour faire cela sous Mac rien de plus facile, avant de copier il faut maintenir la touche Option ( c.a.d. ALT). On a alors dans le menu la possibilité de fusionner.

    google mac Created Sun, 24 Feb 2019 00:00:00 +0000
  • Il reste encore du travail à faire … mais cette application est vraiment très bien. J’ai fait énormément de test avec beaucoup de donnée, et je n’ai pas eu de problème.

    joplin Created Sat, 23 Feb 2019 00:00:00 +0000
  • La nouvelle version arrive avec quelques fix et des améliorations :

    On peut même faire du KaTeX !

    joplin Created Fri, 15 Feb 2019 00:00:00 +0000
  • J’ai voulu tester Joplin https://joplin.cozic.net entièrement, pas seulement la synchronisation de 2 ou 3 fichiers.

    J’ai donc fait une base de 2465 notes, et 9787 images :

    : "Total folders: 32"
    : "Total notes: 2465"
    : "Total resources: 9787"

    Mon fichier WebDEV :

    $ du -sh WebDAV/
    2,7G	WebDAV/
    $ ls -l WebDAV/*.md | wc -l
    -bash: /bin/ls: Argument list too long
           0

    Il y a tellement de fichier que la commande “ls” plante :) , en fait il y a 13051 fichiers pour 2,7 Go. Le fichier le plus gros fait 13 Ko.

    joplin Created Thu, 14 Feb 2019 00:00:00 +0000
  • Step 0 : Install Joplin and activate the REST API ( https://joplin.cozic.net/api/ ) .

    Step 1: Install nltk and worldcloud with pip ( for more information see https://www.datacamp.com/community/tutorials/wordcloud-python )

    Step 3.a : Run this scripts for Title (change the token)

    #
    # Version 1 
    # for Python 3
    # 
    #   ARIAS Frederic
    #   Sorry ... It's difficult for me the python :)
    #
    
    from time import gmtime, strftime
    import time
    import json
    import requests
    import os
    import nltk
    nltk.download('punkt')
    nltk.download('stopwords')
    from nltk.tokenize import word_tokenize
    from nltk.corpus import stopwords
    from wordcloud import WordCloud
    import numpy as np
    import matplotlib.pyplot as plt
    
    #IP
    ip = "127.0.0.1"
    #Port
    port = "41184"
    #Token
    token = "Put your token here"
    nb_request = 0
    my_title = ""
    headers = {'Content-type': 'application/json', 'Accept': 'text/plain'}
    url_notes = (
        "http://"+ip+":"+port+"/notes?"
        "token="+token
    )
    try:
        resp = requests.get(url_notes, headers=headers)
        nb_request += 1
        resp.raise_for_status()
        resp_dict = resp.json()
        #print(resp_dict)
        for my_note in resp_dict:
            #print(my_note.get('id'))
            my_title += my_note.get('title')
    except requests.exceptions.HTTPError as e:
        print("Bad HTTP status code:", e)
    except requests.exceptions.RequestException as e:
        print("Network error:", e)
    
    # Create a word cloud image
    stopwords = stopwords.words('french')
    wc = WordCloud(background_color="white", max_words=5000, stopwords=stopwords, contour_width=3, contour_color='firebrick')
    wc.generate(my_title)
    wc.to_file("jopling_title.png")
    plt.figure(figsize=[18,8])
    plt.imshow(wc, interpolation='bilinear')
    plt.axis("off")
    plt.show()

    Step 3.b : Run this scripts for Body (change the token)

    joplin Created Thu, 14 Feb 2019 00:00:00 +0000
  • Awesome Note 2, it’s very popular on iPad :

    The new All-in-one Organizer, Awesome Note 2 is integrated with note and schedule management.
    And now it’s available!!

    WONDERFUL WRITING FEATURES
    · It can be used not only for simple notes, but also rich and wonderful writing tool.
    · Make notes even more powerful to add photos, voice recording and drawings.
    · Easily create diary notes to display feeling, weather or road map information.

    joplin migration python Created Thu, 14 Feb 2019 00:00:00 +0000
  • Step 0 : Install Joplin and activate the REST API ( https://joplin.cozic.net/api/ ) .

    Step 1: Install gmplot with pip

    $ pip install gmplot
    Collecting gmplot
      Downloading https://files.pythonhosted.org/packages/e2/b1/e1429c31a40b3ef5840c16f78b506d03be9f27e517d3870a6fd0b356bd46/gmplot-1.2.0.tar.gz (115kB)
        100% |████████████████████████████████| 122kB 1.0MB/s 
    Requirement already satisfied: requests in /usr/local/lib/python3.7/site-packages (from gmplot) (2.21.0)
    Requirement already satisfied: urllib3<1.25,>=1.21.1 in /usr/local/lib/python3.7/site-packages (from requests->gmplot) (1.24.1)
    Requirement already satisfied: certifi>=2017.4.17 in /usr/local/lib/python3.7/site-packages (from requests->gmplot) (2018.11.29)
    Requirement already satisfied: idna<2.9,>=2.5 in /usr/local/lib/python3.7/site-packages (from requests->gmplot) (2.8)
    Requirement already satisfied: chardet<3.1.0,>=3.0.2 in /usr/local/lib/python3.7/site-packages (from requests->gmplot) (3.0.4)
    Building wheels for collected packages: gmplot
      Building wheel for gmplot (setup.py) ... done
      Stored in directory: /Users/...../Library/Caches/pip/wheels/81/6a/76/4dd6a7cc310ba765894159ee84871e8cd55221d82ef14b81a1
    Successfully built gmplot
    Installing collected packages: gmplot
    Successfully installed gmplot-1.2.0

    The source code : (change your token)

    joplin python Created Wed, 13 Feb 2019 00:00:00 +0000