diff --git "a/ghv10_javascript-10.json" "b/ghv10_javascript-10.json"
new file mode 100644
--- /dev/null
+++ "b/ghv10_javascript-10.json"
@@ -0,0 +1 @@
+{"data": [{"name": "jflyfox/jfinal_cms", "link": "https://github.com/jflyfox/jfinal_cms", "tags": ["jfinal", "beetl", "mysql", "javaweb", "cms"], "stars": 598, "description": "jfinal cms\u662f\u4e00\u4e2ajava\u5f00\u53d1\u7684\u529f\u80fd\u5f3a\u5927\u7684\u4fe1\u606f\u54a8\u8be2\u7f51\u7ad9\uff0c\u91c7\u7528\u4e86\u7b80\u6d01\u5f3a\u5927\u7684JFinal\u4f5c\u4e3aweb\u6846\u67b6\uff0c\u6a21\u677f\u5f15\u64ce\u7528\u7684\u662fbeetl\uff0c\u6570\u636e\u5e93\u7528mysql\uff0c\u524d\u7aefbootstrap\u6846\u67b6\u3002\u652f\u6301oauth2\u8ba4\u8bc1\u3001\u5e10\u53f7\u6ce8\u518c\u3001\u5bc6\u7801\u52a0\u5bc6\u3001\u8bc4\u8bba\u53ca\u56de\u590d\uff0c\u6d88\u606f\u63d0\u793a\uff0c\u7f51\u7ad9\u8bbf\u95ee\u91cf\u7edf\u8ba1\uff0c\u6587\u7ae0\u8bc4\u8bba\u6570\u548c\u6d4f\u89c8\u91cf\u7edf\u8ba1\uff0c\u56de\u590d\u7ba1\u7406\uff0c\u652f\u6301\u6743\u9650\u7ba1\u7406\u3002\u540e\u53f0\u6a21\u5757\u5305\u542b\uff1a\u680f\u76ee\u7ba1\u7406\uff0c\u680f\u76ee\u516c\u544a\uff0c\u680f\u76ee\u6eda\u52a8\u56fe\u7247\uff0c\u6587\u7ae0\u7ba1\u7406\uff0c\u56de\u590d\u7ba1\u7406\uff0c\u610f\u89c1\u53cd\u9988\uff0c\u6211\u7684\u76f8\u518c\uff0c\u76f8\u518c\u7ba1\u7406\uff0c\u56fe\u7247\u7ba1\u7406\uff0c\u4e13\u8f91\u7ba1\u7406\u3001\u89c6\u9891\u7ba1\u7406\u3001\u7f13\u5b58\u66f4\u65b0\uff0c\u53cb\u60c5\u94fe\u63a5\uff0c\u8bbf\u95ee\u7edf\u8ba1\uff0c\u8054\u7cfb\u4eba\u7ba1\u7406\uff0c\u6a21\u677f\u7ba1\u7406\uff0c\u7ec4\u7ec7\u673a\u6784\u7ba1\u7406\uff0c\u7528\u6237\u7ba1\u7406\uff0c\u89d2\u8272\u7ba1\u7406\uff0c\u83dc\u5355\u7ba1\u7406\uff0c\u6570\u636e\u5b57\u5178\u7ba1\u7406\u3002", "lang": "JavaScript", "repo_lang": "", "readme": "jfinal cms\r\n------------------------\r\n\r\n> 1. jfinal cms, using the simple and powerful JFinal as the web framework, the template engine uses beetl, the database uses mysql, and the front-end bootstrap framework.\r\n> 2. The background module includes: column management, column announcement, column scrolling pictures, article management, reply management, feedback, my album, album management, picture management, album management, video management, cache update, friendship link, visit statistics , Contact management, template management, organization management, user management, role management, menu management, parameter configuration, data dictionary management.\r\n> 3. Back-end template support: bootstrap default style, bootstrap black style and flat-ui style\r\n> 4. Front-end template support: default content release, official website template, picture template and video template\r\n> 5. jfinal cms communication group: 568909653. For documentation, see doc/jfinal cms documentation.docx\r\n\r\n* Management address: http://${ip:port}/${project_name}/admin\r\n* Test account: admin/admin123 or test/123456\r\n\r\nPlatform Deployment and Configuration Instructions\r\n------------------------\r\n\r\n> 1. Download the project code, install jdk, maven, mysql.\r\n> 2. Run mvn install in the project directory and prompt BUILD SUCCESS.\r\n> 3. Create a mysql user and database, run /jfinal_cms/sql corresponding to jfinal_cms_v4.sql.\r\n> 4. Database configuration file: /jfinal_cms/src/main/resources/conf/db.properties\r\n> 5. If oauth2 is required, set src/conf/oauth.properties\r\n> 6. Run: mvn tomcat:run\r\n> 7. The system adopts single-site mode by default, and each site can be easily switched in the \"Site Management\" menu under \"Other Management\".\r\n> 8. 
If you use multi-site, you can change the item \"Multi-site Mark\" to true in the \"Multi-site Mark\" menu under \"System Management\".\r\n> 9. Multi-site needs to set the domain name corresponding to each site, and resolve to different site templates through the domain name.\r\n\r\n\r\nProject source address:\r\n------------------------\r\n\r\ngithub address: https://github.com/jflyfox/jfinal_cms\r\n\r\nCode cloud address: https://gitee.com/jflyfox/jfinal_cms\r\n\r\nAPI Clinet project source address:\r\n------------------------\r\n\r\ngithub address: https://github.com/jflyfox/jfinal_cms_api_client\r\n\r\nCode cloud address: https://gitee.com/jflyfox/jfinal_cms_api_client\r\n\r\nScreenshot of demo effect\r\n------------------------\r\n\r\n#### Website CMS address: [http://mtg.jflyfox.com/](http://mtg.jflyfox.com/) ####\r\n![Website](http://static.oschina.net/uploads/img/201601/21022316_Nk5M.gif \"jfinal cms\")\r\n\r\n#### Website official website template: [http://website.jflyfox.com/](http://website.jflyfox.com/) ####\r\n![Official website](http://static.oschina.net/uploads/img/201601/21022316_XkxY.gif \"jfinal cms\")\r\n\r\n#### Blog Template Template: [http://blog.jflyfox.com/](http://blog.jflyfox.com) ####\r\n![Official website](http://static.oschina.net/uploads/space/2016/0622/002206_Rla0_166354.jpg \"jfinal cms\")\r\n\r\n#### Album management template: [http://photo.jflyfox.com/](http://photo.jflyfox.com/) ####\r\n![Official website](http://static.oschina.net/uploads/space/2016/0306/144741_ldOJ_166354.gif \"jfinal cms\")\r\n\r\n#### Video management template: [http://video.jflyfox.com/](http://video.jflyfox.com/) ####\r\n![Official website](http://static.oschina.net/uploads/space/2016/0306/144754_FXhR_166354.gif \"jfinal cms\")\r\n\r\n#### Background page theme: ####\r\n![Background management](http://static.oschina.net/uploads/img/201601/28091447_rQtD.gif \"jfinal cms\")\r\n\r\nthank you\r\n------------------------\r\n\r\n 1. [JFinal](http://www.oschina.net/p/jfinal)\r\n 2. [beetl](http://ibeetl.com/community/)\r\n 3. [oschina](http://www.oschina.net/)\r\n\r\nproject support\r\n------------------------\r\n\r\n- The development of the project is inseparable from everyone's support~! 
~\r\n\r\n- [Alibaba Cloud's latest event: the latest event on Double 11, as low as 10% off; there is also a gift package for newcomers; please click here] (https://www.aliyun.com/1111/2019/home?spm=5176.11533457.1089570.70.4fe277e3TKVLoB&userCode =c4hsn0gc)\r\n- 1 core 2G1M40G disk, 86 yuan/1 year,\r\n- 2-core 4G3M40G disk, 799 yuan / 3 years,\r\n- 2-core 8G5M40G disk, 1399 yuan / 3 years.\r\n\r\n- [Aliyun: ECS cloud server 2 discount; please click here] (https://www.aliyun.com/acts/limit-buy?spm=5176.11544616.khv0c5cu5.1.1d8e23e8XHvEIq&userCode=c4hsn0gc)\r\n- [Alibaba Cloud: ECS cloud server new coupon; please click here] (https://promotion.aliyun.com/ntms/yunparter/invite.html?userCode=c4hsn0gc)\r\n\r\n- You can also buy the author a cup of coffee :)\r\n\r\n![jflyfox](https://raw.githubusercontent.com/jflyfox/jfinal_cms/master/doc/pay01.jpg \"Open source support\")\r\n\r\n\r\ndonation list\r\n------------------------\r\n\r\n| Name | Amount | Remarks | Time |\r\n\r\n| :-------: |:----: | :-----:|----- |-----|\r\n\r\n| A Tao | \uffe5200.00 | Alipay Donation | 2018-09-17 13:41|\r\n\r\n| A Leng | \uffe5100.00 | WeChat Donation | 2018-08-23 11:03|\r\n\r\n| Ecstatic | \uffe550.00 | Alipay donation | 2018-01-12 18:10|\r\n\r\n| Ecstatic | \uffe550.00 | Alipay donation | 2018-01-12 18:10|\r\n\r\n| Origid | \uffe550.00 | Alipay donation | 2017-08-20 20:10|\r\n\r\n| Illusion | \uffe520.00 | WeChat Donation | 2017-07-18 18:34|\r\n\r\n| Weak water pierces the sky | \uffe550.00 | Alipay donation | 2017-04-28 10:17|\r\n\r\n| Niuniu | \uffe5100.00 | WeChat Donation | 2017-04-17 17:36|\r\n\r\n| Wheat Fields in Beijing in 2001 | \uffe550.00 | WeChat Donation | 2017-03-09 16:58|\r\n\r\n| You in this life | \uffe520.00 | Alipay donation | 2017-02-13 12:32|\r\n\r\n| Jianqiang | \uffe5500.00 | Alipay Donation | 2017-01-19 23:04|\r\n\r\n| Xiaojun | \uffe510.00 | Alipay Donation | 2016-11-30 22:58|\r\n\r\n| Xiaojun | \uffe510.00 | Alipay Donation | 2016-11-19 09:34|\r\n\r\n| Zhengzhou Yupin Electronic Commerce Co., Ltd. 
| \uffe5300.00 | Alipay donation | 2016-09-23 14:13|\r\n\r\n| Zhou Ketao | \uffe510.00 | Alipay donation | 2016-08-12 19:43|\r\n\r\n| Yang Mou | \uffe51.00 | Alipay Donation | 2016-06-29 14:12|\r\n\r\n| magicbug | \uffe5500.00 | Alipay donation | 2016-06-20 15:14|\r\n\r\n| Du Yuxuan | \uffe5100.00 | Alipay donation | 2016-05-29 10:48|\r\n\r\n| Xie Xiepai | \uffe520.00 | Alipay donation | 2016-05-01 22:33|| Dung and hair painted walls | \uffe51.00 | WeChat donation | 2016-04-17 21:18|\r\n\r\n| Hu Haifeng | \uffe510.00 | WeChat Donation | 2016-04-12 15:23|\r\n\r\n| Li Shoujing | \uffe510.00 | Alipay donation | 2016-03-10 17:20|\r\n\r\n| Han Qianye | \uffe520.00 | Alipay donation | 2016-03-05 18:35|\r\n\r\n| Gods down to earth | \uffe51.00 | WeChat donation | 2016-03-03 18:30|\r\n\r\n| Zhang Runshe | \uffe51.00 | Alipay donation | 2016-03-01 21:18|\r\n\r\n| Li Shengfa | \uffe520.00 | Alipay donation | 2016-02-23 22:25|\r\n\r\n| Jia Xiaolong | \uffe520.00 | WeChat Donation | 2016-02-20 15:20|\r\n\r\n| Han Ganglong | \uffe520.00 | Alipay donation | 2016-02-10 16:17|\r\n\r\n| Huang Ying | \uffe510.00 | Alipay donation | 2016-02-08 16:25|\r\n\r\n| Kong Weiyuan | \uffe520.00 | Alipay donation | 2016-02-07 18:40|\r\n\r\n| Little Pig | \uffe510.00 | WeChat Donation | 2016-02-07 18:20|\r\n\r\n| Field | \uffe520.00 | Alipay Donation | 2016-02-07 16:15|\r\n\r\n| Allen | \uffe510.00 | Alipay Donation | 2016-02-07 15:25|\r\n\r\n| Flying Dragon in the Sky | \uffe520.00 | WeChat Donation | 2016-02-05 15:20|\r\n\r\n| Qiu Guolin | \uffe55.00 | Alipay donation | 2016-02-05 16:17|\r\n\r\n| Li Rongfu | \uffe550.00 | Alipay donation | 2016-01-05 14:15|\r\n\r\n| Xia Shuzheng | \uffe51.00 | WeChat Donation | 2015-12-03 18:30|\r\n\r\n| Guo Junli | \uffe510.00 | Alipay donation | 2015-11-23 21:25|\r\n\r\n| Hou Shanzhi | \uffe55.00 | Alipay donation | 2015-11-10 12:30|\r\n\r\n| Dave | \uffe520.00 | WeChat Donation | 2015-09-05 15:10|\r\n\r\n| Li Gukuang | \uffe51.00 | Alipay donation | 2015-08-03 23:30|\r\n\r\n| Wen Ziyin | \uffe520.00 | WeChat donation | 2015-07-23 19:20|\r\n\r\n| He Wang | \uffe510.00 | Alipay Donation | 2015-07-10 17:29|\r\n\r\n| Li Xinge | \uffe520.00 | Alipay donation | 2015-05-07 20:00|\r\n\r\n| Su Mou | \uffe520.00 | Alipay donation | 2015-04-01 20:18|\r\n\r\n| Han Yun | \uffe510.00 | Alipay Donation | 2015-02-01 20:18|\r\n\r\n| lucky | \uffe510.00 | Alipay donation | 2015-01-20 15:10|\r\n\r\n| Wang Feng | \uffe510.00 | Alipay donation | 2015-01-10 22:00|", "readme_type": "markdown", "hn_comments": "", "gh_updated_time": "", "gh_accessed_time": "", "hn_accessed_time": ""}, {"name": "rizwansoaib/whatsapp-monitor", "link": "https://github.com/rizwansoaib/whatsapp-monitor", "tags": ["whatsapp-monitor", "online-tracker", "online-status", "whatsapp-online", "whatsapp-tracker", "android-whatsapp-tracker", "android-whatsapp-monitor", "whatsapp-notification", "smartphone-notification", "tracker-online", "whatsapp-contacts", "android-notification-service", "chrome-extensions", "browser-extension", "whatsapp-desktop", "whatsapp-desktop-client", "whatsapp-web-linux", "free", "free-online-tracker", "free-whatsapp-online-tracker"], "stars": 598, "description": "Whatsapp Online Tracker \ud83d\udcf2 | WhatsApp last seen tracker | [Get Notification \ud83d\udd14 and history \ud83d\udcdc of Online WhatsApp Contact]", "lang": "JavaScript", "repo_lang": "", "readme": "
\n\n\n \n\n [![](https://img.shields.io/badge/Browser%20Extension-WhatsApp%20Online%20Monitor-green)](https://addons.mozilla.org/en-US/firefox/addon/whatsapp-online-monitor/)\n [![Open Source](https://badges.frapsoft.com/os/v1/open-source.svg?v=103)](https://opensource.org/)\n [![Gitter](https://badges.gitter.im/whatsapp-monitor/Chat.svg)](https://gitter.im/whatsapp-monitor/Chat?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge)\n[![contributions welcome](https://img.shields.io/badge/contributions-welcome-brightgreen.svg?style=flat)](https://github.com/dwyl/esta/issues)\n\n \n
\n\n\n\n\n Whatsapp Monitor \nWhatsApp Contact Online Monitoring Tool \n\nFree WhatsApp Online Tracker App for Desktop and Browser Extension and get Notification in Your Android and iOS Devices \n\n\nWhen Your Contact will be online \ud83e\udd33 on Whatsapp \ud83d\udc40 get the notification \ud83d\udd14 in your Desktop \ud83d\udda5\ufe0f and cross-platform notification without any installed app in Smartphone \ud83d\udcf1 and more \n\n Features in v3.0 \n \n Play Sound \ud83d\udd09 When Contact online \n Online History \ud83d\udcdc Download in CSV and auto save csv \n Access History \ud83d\udce1 on web (Saved data on Server) \n Desktop Notification \ud83d\udd14 \n Cross Platform Notification \ud83d\udcf2 \ud83d\udcbb e.g. Android,Macbook,Linux,Windows (No need Install App) \n Concurrent Multiple Contacts \ud83d\udc6a Tracking \n Add Favourite Contacts Auto Open Chats \n \n\n\n \n \n\n\n\n \n Whatsapp Monitor Desktop v1.4 \n\n \n \n\n\n\n\n# Introduction\n\n[![](https://user-images.githubusercontent.com/29729380/83626193-b359f400-a5b2-11ea-87c9-ab6ab2e8376f.gif)](https://youtu.be/CrHjJIbBmKs)\n\n Extension \ud83c\udf10 \n Play Sound \ud83d\udd09 when Contact become Online \ud83e\udd33 and Web Push Notification \ud83d\udcf3 and Cross Platform Notification e.g. Android(No need to Install any App) Download \ud83d\udce5 History \ud83d\udcdc of Online in CSV Format \n\n[![video3](https://user-images.githubusercontent.com/29729380/108585182-048e5b80-736d-11eb-95f8-7340ab5d22d8.png)\n](https://youtu.be/CrHjJIbBmKs)\n\n\n \n\n Notification \ud83d\udd15 \n \n Extension (Cross Platform) \ud83d\udcf3 \n \n![mobile](https://user-images.githubusercontent.com/29729380/74177733-5384b980-4c60-11ea-8b86-a40267588194.png)\n \n Extension (Windows) \ud83d\udda5\ufe0f \n \n ![Windows](https://user-images.githubusercontent.com/29729380/74180771-115e7680-4c66-11ea-9939-b9eca7e4b646.png)\n \n \n \n# Faq (Frequently Asked Question)\n ### **Q1: I Have no Desktop. How can I use extension?**\n #### **Ans: Don't take tension we have designed this extension for Mobile support.**\n#### **You only need two devices one should android used for tracking**\n\n 1. Install [Kiwi Browser](https://play.google.com/store/apps/details?id=com.kiwibrowser.browser) in your Android\n \n 2. Install [Chrome Extension](https://chrome.google.com/webstore/detail/online-monitor/emkoflhmeegjlabodebpfbkeicjppebi/) in Kiwi Browser\n \n 3. Open WhatsApp Web in Desktop mode and run extension same as Desktop\n \n ### **Q2: Android extension how I will get notification?**\n #### **Ans: Use Notification Key and Subscribe any devices you will get Notification**\n \n \n\n \n \n\n\n\n \n \n \n \n \n Contributions \ud83d\udd90\ufe0f \n\n \n \n\n This program is free \ud83c\udd93 software: you can redistribute it and/or modify \ud83d\udc68\ud83c\udffb\u200d\ud83d\udcbb it under the terms of \n the GNU General \ud83d\udce2 Public License with Credit Mention of all contributors of repo in Your Project as published by the Free Software Foundation \ud83c\udf0e \n\n![License](https://user-images.githubusercontent.com/29729380/83224186-69c86e00-a19a-11ea-9783-37969dbf78b7.png)\n\n\n Legal \u2696\ufe0f \n This code \ud83d\udc68\ud83c\udffb\u200d\ud83d\udcbb is in no way affiliated \ud83d\udd17 with, authorized \u2714\ufe0f, maintained \ud83d\udd00, sponsored \ud83d\udc53 or endorsed \ud83d\udc4a by WhatsApp or any of its affiliates or subsidiaries. 
This is an independent and unofficial code \ud83e\uddd1\ud83c\udffb\u200d\ud83d\udcbb Use at your own risk \n \n Disclaimer \u26a0\ufe0f \n This software is for educational \ud83c\udfeb purpose only. \ud83d\udd75\ufe0f\u200d\u2642\ufe0f Keeping eye \ud83d\udc40 on a innocent person \ud83d\ude47 can make a person's life stressful \ud83d\ude1e and don't blackmail someone \ud83d\udcf4 for fun in your life. Give respect \ud83d\ude4f to privacy of every person \ud83d\udc6a \n \n\n\n \n\n Author \ud83e\uddd1\u200d\ud83d\udcbb \n\n If you \ud83d\udc4d the project, support us by \ud83c\udf1f Thank You \ud83d\ude4f \n\n\n", "readme_type": "markdown", "hn_comments": "Whatsapp Online Tracker calling [Get Notification bell and history scroll of Online]Track whatsapp contact online history", "gh_updated_time": "", "gh_accessed_time": "", "hn_accessed_time": ""}, {"name": "OnetapInc/chromy", "link": "https://github.com/OnetapInc/chromy", "tags": ["chrome", "node", "headless-chrome", "javascript", "browser", "nightmare", "casperjs"], "stars": 598, "description": "Chromy is a library for operating headless chrome. \ud83c\udf7a\ud83c\udf7a\ud83c\udf7a", "lang": "JavaScript", "repo_lang": "", "readme": "# Chromy\n\nChromy is a library for operating headless chrome.\n\nDocument Site: https://onetapinc.github.io/chromy/\n\nChromy is similar to Nightmare.js but has some differences:\n\n - Controlling Chrome via Chrome DevTools Protocol.\n - Supports mobile emulation.\n - No need to prepare a screen such as xvfb.\n\n## Requirements\n\n - Node 6 or later\n - Install Chrome60 or later to your machine before use Chromy.\n\nheadless mode is supported by Chrome59 or later.\n\n## Installation\n\n```bash\nnpm i chromy\n```\n\n## Usage\n\n```js\nconst Chromy = require('chromy')\n\n// not headless\n// let chromy = new Chromy({visible:true})\nlet chromy = new Chromy()\nchromy.chain()\n .goto('http://example.com/')\n .evaluate(() => {\n return document.querySelectorAll('*').length\n })\n .result((r) => console.log(r))\n .end()\n .then(() => chromy.close())\n```\n\nYou can also use async/await interfaces like this:\n\n```js\nconst Chromy = require('chromy')\n\nasync function main () {\n let chromy = new Chromy()\n await chromy.goto('http://example.com/')\n const result = await chromy.evaluate(() => {\n return document.querySelectorAll('*').length\n })\n console.log(result)\n await chromy.close()\n}\n\nmain()\n```\n\n### Mobile Emulation\n\nChromy provides mobile emulation. 
\nThe emulation changes a screen resolution, density, userAgent and provides touch emulation.\n\n```js\nconst Chromy = require('chromy')\n\nlet chromy = new Chromy()\nchromy.chain()\n .emulate('iPhone6')\n .goto('http://example.com/')\n .tap(100, 100) // emulate tap action by synthesizing touch events.\n .evaluate(() => {\n return navigator.userAgent\n })\n .result(console.log)\n .end()\n .then(() => chromy.close())\n```\n\n## FAQ\n\n[FAQ](https://github.com/OnetapInc/chromy/wiki/FAQ)\n\n## API\n\n * [Chromy(options)](#chromyoptions)\n * [.start(startingUrl = null)](#startstartingurl--null)\n * [.goto(url, options = {})](#gotourl-options--)\n * [.waitLoadEvent()](#waitloadevent)\n * [.userAgent(ua)](#useragentua)\n * [Chromy.addCustomDevice(device)](#chromyaddcustomdevicedevice)\n * [.emulate(deviceName)](#emulatedevicename)\n * [.forward()](#forward)\n * [.back()](#back)\n * [.inject(type, file)](#injecttype-file)\n * [.evaluate(func|source)](#evaluatefuncsourceargs)\n * [.result(func)](#resultfunc)\n * [.end()](#end)\n * [.exists(selector)](#existsselector)\n * [.visible(selector)](#visibleselector)\n * [.wait(msec)](#waitmsec)\n * [.wait(selector)](#waitselector)\n * [.wait(func)](#waitfunc)\n * [.sleep(msec)](#sleepmsec)\n * [.type(selector, text)](#typeselector-text)\n * [.insert(selector, text)](#insertselector-text)\n * [.check(selector)](#checkselector)\n * [.uncheck(selector)](#uncheckselector)\n * [.select(selector, value)](#selectselector-value)\n * [.setFile(selector, files)](#setfileselector-files)\n * [.click(selector, options)](#clickselector-options)\n * [.mouseMoved(x, y, options = {})](#mousemovedx-y-options--)\n * [.mousePressed(x, y, options = {})](#mousepressedx-y-options--)\n * [.mouseReleased(x, y, options = {})](#mousereleasedx-y-options--)\n * [.tap(x, y, options = {})](#tapx-y-options--)\n * [.doubleTap(x, y, options = {})](#doubletapx-y-options--)\n * [.scroll(x, y)](#scrollx-y)\n * [.scrollTo(x, y)](#scrolltox-y)\n * [.rect(selector)](#rectselector)\n * [.rectAll(selector)](#rectallselector)\n * [.defineFunction(func)](#definefunctionfunc)\n * [.on(eventName, listener)](#oneventname-listener)\n * [.once(eventName, listener)](#onceeventname-listener)\n * [.removeListener(eventName, listener)](#removelistenereventname-listener)\n * [.removeAllListeners(eventName)](#removealllistenerseventname)\n * [.screenshot(options= {})](#screenshotoptions-)\n * [.screenshotSelector(selector, options={})](#screenshotselectorselector-options)\n * [.screenshotMultipleSelectors(selectors, callback, options = {})](#screenshotmultipleselectorsselectors-callback-options--)\n * [.screenshotDocument(options = {})](#screenshotdocumentoptions--)\n * [.pdf(options={})](#pdfoptions)\n * [.startScreencast(callback, options = {})](#startscreencastcallback-options--)\n * [.stopScreencast()](#stopscreencast)\n * [.console(func)](#consolefunc)\n * [.receiveMessage(func)](#receivemessagefunc)\n * [.ignoreCertificateErrors()](#ignorecertificateerrors)\n * [.blockUrls(urls)](#blockurlsurls)\n * [.clearBrowserCache()](#clearbrowsercache)\n * [.setCookie(params)](#setcookieparams)\n * [.getCookies(params)](#getcookieparams)\n * [.deleteCookie(name, url = null)](#deletecookiename-url--null)\n * [.clearAllCookies()](#clearallcookies)\n * [.clearDataForOrigin (origin = null, type = 'all')](#cleardatafororigin-origin--null-type--all)\n * [.getDOMCounters()](#getdomcounters)\n * [.static cleanup()](#static-cleanup)\n\n##### Chromy(options)\n\n###### options \n\n - host(default: localhost): host address\n 
- port(default: 9222): --remote-debugging-port \n - userDataDir(default: null): Chrome profile path. This option can be used to persist an user profile.\n - launchBrowser(default: true): If you want chromy to attach to the Chrome instance that is already launched, set to false.\n - visible(default: false): If set to true, chrome is launched in visible mode. This option is not used if launchBrowser is false.\n - chromePath(default: null): This option is used to find out an executable of Chrome. If set to null, executable is selected automatically. This option is not used if launchBrowser is false.\n - enableExtensions(default: false): Enable extension loading. (Generally, this options is used with userDataDir option)\n - chromeFlags(default: []): These flags is passed to Chrome. Each flag must have a prefix string \"--\". This option is not used if launchBrowser is false.\n - waitTimeout(default: 30000): If wait() doesn't finish in the specified time WaitTimeoutError will be thrown.\n - gotoTimeout(default: 30000): If goto() doesn't finish in the specified time GotoTimeoutError will be thrown.\n - evaluateTimeout(default: 30000): If evaluate() doesn't finish in the specified time EvaluateTimeError will be thrown.\n - waitFunctionPollingInterval(default: 100): polling interval for wait().\n - typeInterval(default: 20): This option is used only in type() method.\n - activateOnStartUp(default: true): activate a first tab on startup. this option is enable only in visible mode.\n\n\n##### .start(startingUrl = null)\n\nLaunches Chrome browser.\n\n###### options\n\nstartingUrl: a staring url. If you set to null 'about:blank' is used as a starting url.\n\n##### .goto(url, options = {})\n\nGoes to url. If you have not called start(), this method calls start(url) automatically.\n\n###### options\n\nwaitLoadEvent(default: true): If set to false, goto() doesn't wait until load event is fired.\n\n###### returns\n\nReturns [Response object](https://chromedevtools.github.io/devtools-protocol/tot/Network/#type-Response)\n\n##### .waitLoadEvent()\n\nwait until a load event is fired.\n\n##### .userAgent(ua)\n\nset a useragent.\n\nua: new user agent.\n\n##### Chromy.addCustomDevice(device)\n\nadd custom device definitions to emulate it.\n\nSee [src](src/devices.js).\n\n##### .emulate(deviceName)\n\nemulate a device that is defined by `Chromy.addCustomDevice()`.\n\n##### .forward()\n\ngo forward to the next page and wait until load event is fired.\n\n##### .back()\n\ngo back to the previous page and wait until load event is fired.\n\n##### .inject(type, file)\n\nInjects a file into browser as a javascript or a css.\n\ntype: must be 'js' or 'css'\nfile: injected file.\n\n##### .evaluate(func|source, args)\n\nEvaluates a expression in the browser context. 
\nIf the expression returns a Promise object, the promise is resolved automatically.\n\n##### .result(func)\n\nresult() receives a result of previous directive.\n\n```js\nchromy.chain()\n .goto('http://example.com')\n .evaluate(() => {\n return document.querySelectorAll('*').length\n })\n .result((length) => {\n // length is a result of evaluate() directive.\n console.log(length)\n }\n .end()\n```\n\n##### .end()\n\n##### .exists(selector)\n\nReturns whether an node matched with the selector is exists.\n\n##### .visible(selector)\n\nReturns whether an node matched with the selector is exists and visible.\n\n##### .wait(msec)\n\nalias for .sleep(msec)\n\n##### .wait(selector)\n\nwait until selector you specified appear in a DOM tree.\n\n##### .wait(func)\n\nwait until function you supplied is evaluated as true. func() executes in browser window context.\n\n##### .sleep(msec)\n\nwait for milli seconds you specified.\n\n##### .type(selector, text)\n\n##### .insert(selector, text)\n\n##### .check(selector)\n\n##### .uncheck(selector)\n\n##### .select(selector, value)\n\n##### .setFile(selector, files)\n\nSets the files to a file field that matches the selector.\n\n - selector: selector for specifying the file field.\n - files: The array or string value that represents a local file path.\n\n##### .click(selector, options)\n\n###### options\n\nwaitLoadEvent(default: false): If set to true, wait until load event is fired after click event is fired.\n\n##### .mouseMoved(x, y, options = {})\n\nDispatch mousemoved event.\n\n##### .mousePressed(x, y, options = {})\n\nDispatch mousedown event.\n\n##### .mouseReleased(x, y, options = {})\n\nDispatch mouseup event.\n\n##### .tap(x, y, options = {})\n\nSynthesize tap by dispatching touch events.\n(NOTE: To dispatch touch events you need to enable a mobile emulation before.)\n\n##### .doubleTap(x, y, options = {})\n\nSynthesize double tap by dispatching touch events.\n(NOTE: To dispatch touch events you need to enable a mobile emulation before.)\n\n##### .scroll(x, y)\n\nScrolls to the position. x and y means relative position.\n\n##### .scrollTo(x, y)\n\nScrolls to the position. 
x and y means absolute position.\n\n##### .rect(selector)\n\nReturns a rect of the element specified by selector.\n\n##### .rectAll(selector)\n\nReturns an array of rects that is specified by selector.\n\n##### .defineFunction(func)\n\n```js\nfunction outerFunc () {\n return 'VALUE'\n}\nchromy.chain()\n .goto('http://example.com')\n .defineFunction(outerFunc)\n .evaluate(() => {\n outerFunc()\n })\n .end()\n```\n\n##### .send(eventName, parameter)\n\nCalls DevTools protocol directly.\n\n##### .on(eventName, listener)\n\nAdds the listener function.\n\n##### .once(eventName, listener)\n\nAdds one time listener function.\n\n##### .removeListener(eventName, listener)\n\nRemoves the listener function.\n\n##### .removeAllListeners(eventName)\n\nRemoves all listener function.\n\n##### .screenshot(options= {})\n\nExports a current screen as an image data.\n\nSee examples: [examples/screenshot.js](examples/screenshot.js)\n\n###### options\n\n - format(default: 'png'): must be either 'png' or 'jpeg'\n - quality(default: 100): quality of image.\n - fromSurface(default: true): if set to true, take screenshot from surface.\n - useDeviceResolution(default: false): if set to true, the image will have same resolution with device.\n\n##### .screenshotSelector(selector, options={})\n\nExports an area of selector you specified as an image data.\n\nSee examples: [examples/screenshot.js](examples/screenshot.js)\n\nNote:\n\n - The size of target specified by selector must be smaller than viewport size. If not, image gets cropped.\n - It has a side-effect. After this api is called, scroll position is moved to target position.\n\n###### options\n\nSee screenshot()\n\n##### .screenshotMultipleSelectors(selectors, callback, options = {})\n\nTakes multiple screenshot specified by selector at once.\nEach image can be received by callback.\n\nLimitation:\n - It is impossible that taking a screenshot of the element positioned at below of 16384px because of limitation of chrome.\n Detail: https://groups.google.com/a/chromium.org/d/msg/headless-dev/DqaAEXyzvR0/P9zmTLMvDQAJ\n\n###### Parameter\n\n - selectors: An array of selector\n - callback: function(error, image, index, selectors, subIndex)\n - error: error information.\n - image: image data\n - index: index of selectors.\n - subIndex: this value is used only if useQuerySelecotrAll is true.\n - options: \n - model: see explanation of screenDocument()\n - format: see explanation of screenshot()\n - quality: see explanation of screenshot()\n - fromSurface: see explanation of screenshot()\n - useQuerySelectorAll(default: false): If set to true, take all the screenshot of elements returned from document.querySelectorAll() (Since v 0.2.13)\n\n##### .screenshotDocument(options = {})\n\nExports a entire document as an image data.\n\nSee examples: [examples/screenshot.js](examples/screenshot.js)\n\nLimitation:\n - Cannot take a screenshot of an area under 16384px.\n Detail: https://groups.google.com/a/chromium.org/d/msg/headless-dev/DqaAEXyzvR0/P9zmTLMvDQAJ\n\nKnown Issue:\n\n - When this api is called to take large page sometimes strange white area is appeared. This result is caused by --disable-flag option passed to Chrome. After chrome 60 is officially released I remove --disable-flag option to fix this problem.\n\n###### options\n\n - model: this parameter affect page size. must be which one of: 'box', 'scroll'. 'box' means box model of body element. 
'scroll' means size of scroll area.\n - format: see explanation of screenshot()\n - quality: see explanation of screenshot()\n - fromSurface: see explanation of screenshot()\n\n##### .pdf(options={})\n\nExports a current page's printing image as a PDF data.\nThis function is supported only in headless mode (since Chrome60).\n\nSee examples: [examples/screenshot.js](examples/screenshot.js)\n\n###### Parameters\n\n - options: See [devtools protocol](https://chromedevtools.github.io/devtools-protocol/tot/Page/#method-printToPDF)\n\n##### .startScreencast(callback, options = {})\n\nStarts screencast to take screenshots by every frame.\n\nSee examples: [examples/screencast.js](examples/screenshot.js)\n\n###### Parameter\n\ncallback: callback function for receiving parameters of screencastFrame event. See details [here](https://chromedevtools.github.io/devtools-protocol/tot/Page/#event-screencastFrame)\noptions: See details [here](https://chromedevtools.github.io/devtools-protocol/tot/Page/#method-startScreencast).\n\n##### .stopScreencast()\n\nStops screencast.\n\n##### .console(func)\n\n```js\nchromy.chain()\n .goto('http://example.com')\n .console((text) => {\n console.log(text)\n })\n .evaluate(() => {\n console.log('HEY')\n })\n .end()\n```\n\n##### .receiveMessage(func)\n\nreceive a message from browser. \n\nYou can communicate with a browser by using receiveMessage() and sendToChromy().\nsendToChromy() is a special function to communicate with Chromy.\nWhen you call receiveMessage() at the first time, sendToChromy() is defined in a browser automatically.\nA listener function passed to receiveMessage() receives parameters when sendToChromy() is executed in a browser.\n\n\n```js\nchromy.chain()\n .goto('http://example.com')\n .receiveMessage((msg) => {\n console.log(msg[0].value)\n })\n .evaluate(() => {\n sendToChromy({value: 'foo'})\n })\n```\n\n##### .ignoreCertificateErrors()\n\nIgnores all certificate errors.\n\n```js\nchromy.chain()\n .ignoreCertificateErrors()\n .goto('https://xxxxx/')\n .end()\n```\n\n##### .blockUrls(urls)\n\nblocks urls from loading. \n\n###### Parameter\n\nurls: array[string] \nWildcard('*') is allowed in url string.\n\n##### .clearBrowserCache()\n\nRemoves all browser caches.\n\n##### .setCookie(params)\n\n###### Parameters\n\nparams: object or array\n\nSee [chrome document](https://chromedevtools.github.io/devtools-protocol/tot/Network/#method-setCookie)\nIf url parameter is not set, current url(location.href) is used as default value.\n\n##### .getCookies(name = null)\n\n###### Parameters\n\nname: string or array of string\n\nSee [chrome document](https://chromedevtools.github.io/devtools-protocol/tot/Network/#method-getCookies)\n\n##### .deleteCookie(name, url = null)\n\nRemove a cookie.\n\n###### Parameters\n\nname: string or array of string\nurl: url associated with cookie. 
If url is not set, current url(location.href) is used as default value.\n\n##### .clearAllCookies()\n\nRemoves all browser cookies.\n\n##### .clearDataForOrigin (origin = null, type = 'all')\n\nClear data for origin.(cookies, local_storage, indexedDb, etc...)\n\nSee details [here](https://chromedevtools.github.io/devtools-protocol/tot/Storage/#method-clearDataForOrigin).\n\n##### .getDOMCounters()\n\nGet count of these item: document, node, jsEventListeners\n\nSee details [here](https://chromedevtools.github.io/devtools-protocol/tot/Memory/#method-getDOMCounters).\n\n##### .static cleanup()\n\nclose all browsers.\n\n```js\nprocess.on('SIGINT', async () => {\n await Chromy.cleanup()\n process.exit(1)\n})\n```\n\n## Contributing\n\nBug reports and pull requests are welcome on GitHub at https://github.com/OnetapInc/chromy\n", "readme_type": "markdown", "hn_comments": "", "gh_updated_time": "", "gh_accessed_time": "", "hn_accessed_time": ""}, {"name": "iconic/illustrator-svg-exporter", "link": "https://github.com/iconic/illustrator-svg-exporter", "tags": [], "stars": 598, "description": "A better way to generate SVGs from Illustrator.", "lang": "JavaScript", "repo_lang": "", "readme": "# Illustrator SVG Exporter\n\nExporting SVGs from Illustrator is a slow, laborious process—this script fixes that. The script doesn't waste your time with GUI or settings you'll never use. You just run the script, select a location to export and you have your SVGs. We love the concept behind [Generator](http://blogs.adobe.com/photoshopdotcom/2013/09/introducing-adobe-generator-for-photoshop-cc.html) and this script takes a strong cue from it. The script exports any layer, group or path named with the `.svg` extension. We use this script to export all our icons for [Open Iconic](https://github.com/iconic/open-iconic).\n\n## Installation\n\nYou don't _have_ to install the script to use it (more on that later), but installing the script is by far the best way to use it. All you need to do is drop the `SVG Exporter.jsx` file in one of the following directories:\n\n* Windows: `C:\\Program Files\\Adobe\\Adobe lllustratorCC2014\\Presets\\[language]\\Scripts\\`\n* Mac OS: `/Applications/Adobe lllustrator CC 2014/Presets/[language]/Scripts/`\n\nNote: Make sure to restart Illustrator if you installed the script while the Application is running.\n\n## Running the Script\n\nOnce the script is installed, you'll be able to run it by going to `File > Scripts > SVG Exporter`. As mentioned, you don't need to install the script. If you want to run it as a one-off, select `File > Scripts > Other Script...` and select the `SVG Exporter.jsx` file in the file chooser.\n\nOnce you run the script, you'll be prompted to select a location to save the SVG files. After a location is set, you're done—the script does the rest.\n\n## Document Setup\n\nThe script doesn't force any setup or organization on you. You can export layers, groups, compound paths or individual paths. Just name the path/layer/group/compound path what you want the file name to be (e.g., my-cool-vector-drawing.svg) and the script will prep it for export. You can export nested layers (example: export indiviual assets as well all assets in a parent layer). The exported SVGs will be cropped to the bounding box of the path/group/layer.\n\nYou can name artboards with a `.svg` extension to export SVGs to specific dimensions other than the paths' bounding box. 
All paths within the artboard will be exported, so make sure to clean up any unwanted paths before export.\n\nIf you want to individually name each element in your SVG for CSS styling (ala [Iconic](http://useiconic.com)), just name each path within a layer or group you wish to have exported. The script will santize the name so that it will be converted to a pretty ID by Illustrator's SVG export engine. _**Hint:** We've also made a slick [Grunt tool](https://github.com/iconic/grunt-svg-toolkit) which (among other things) will convert the IDs from the Illustrator-exported SVG to classes._\n\nIf you previously named an element for export but now don't want to export for some reason, simply lock it to keep it from being exported.\n", "readme_type": "markdown", "hn_comments": "", "gh_updated_time": "", "gh_accessed_time": "", "hn_accessed_time": ""}, {"name": "enjalot/tributary", "link": "https://github.com/enjalot/tributary", "tags": [], "stars": 598, "description": "rapid prototyping with d3.js", "lang": "JavaScript", "repo_lang": "", "readme": "# Tributary\n[![Gitter](https://badges.gitter.im/Join Chat.svg)](https://gitter.im/enjalot/tributary?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge&utm_content=badge) \n\nTributary allows you to share live-editable code snippets. These snippets will\nmost likely use d3.js and allow a user to play with code in a very responsive\nway.\n\nTributary is innovation on principle, taking the excellent work of Gabriel\nFlorit which was in turn inspired by Bret Victor's genius and making it sharable.\n\nTributary is a labor of love by Ian '@enjalot' Johnson and EJ '@mrejfox' Fox.\n\n# Usage:\nStart typing into the code editor and watch your code come to life!\nIf you want to save your work you can click the fork button and it will save into a gist.\nThe url in your browser will update to something like:\nhttp://tributary.io/inlet/2958568\n\nwhere the number: 2958568 is the gist id\n(you can see it here: https://gist.github.com/2958568 )\n\n\n# Development:\n\nOn the backend tributary only depends on node and mongodb:\nTo deploy locally run\n```\ngit clone https://github.com/enjalot/tributary\ncd ./tributary\nnpm install\nnode server.js\n```\n\nIf you want to have github authentication working you will need to setup a\ngithub app ( https://github.com/settings/applications ) and fill out the settings.js (see example-settings.js)\nThe github app should have the following settings: \nfull URL: http://localhost:8888 \ncallback URL: http://localhost:8888/github-authenticated \n\nRight now you will also need to setup an imgur app and set the authentication details in settings.js as well\n\n\nFrontend JS src file compilation with make to static requires node.js, uglify-js and browserify\n```\nnpm install\n```\n\nYou need to compile the frontend code and templates using make:\n```\nmake\n```\nYou can check the Makefile to see how it's done with uglify and handlebars.\nthere is also a watch.sh bash script which will recompile the frontend code\nwhen any files change. \n\nTo run the server you need to modify your /etc/hosts file and add\n```\n127.0.0.1 sandbox.localhost\n```\nthis is because tributary uses a separate subdomain to execute unsafe code in an iframe.\n\n\nSome 3rd party libraries are minified and catted together for convenience. 
The\nresult is found in /static/3rdparty.js\nTo see what those are and how they are bundled look at this repository:\nhttp://github.com/enjalot/3rdparty\n\n\n\nReserved properties of the tributary object:\ntributary.initialize \ntributary.init \ntributary.run \ntributary.g \ntributary.ctx \ntributary.t \ntributary.dt \ntributary.loop \ntributary.loop_type \ntributary.autoinit \ntributary.pause \ntributary.bv \ntributary.nclones \ntributary.clone_opacity \ntributary.duration \ntributary.ease \ntributary.reverse \ntributary.render \n\n\n# Usage as a node module\n\n\n\n### Contexts\n\nI'm using latest CodeMirror from git (updating every-so often) \nI have customized the JSHINT options in addons/lint/javascript-lint.js to be:\n```\n{\n asi: true,\n laxcomma: true,\n laxbreak: true,\n loopfunc: true,\n smarttabs: true,\n sub: true\n}\n```\n\n\n\n\n### TODO: \n\n#### Editor UI: \n+ re-enable vim and emacs mode (add ui for those selections somewhere) \n+ re-enable local storage backups per editor (need it so you can load code back but not execute it) \n\n#### File UI: \n+ open file from disk (file dialog) \n+ edit filename \n+ delete files \n\n+ Embedding example (simpler UI, assemble from fewer pieces) \n\n+ Make BV button work for any of the renders (not just svg) \n\n#### Contexts\n\n+ enable number scrubbing for text mode (csv and tsv files)\n\n\n", "readme_type": "markdown", "hn_comments": "I wrote a little post about why I'm so excited about Tributary here: http://ejfox.tumblr.com/post/22615208842What I think is really cool is how many applications it could potentially have, from education to performance.Really exciting, and a great educational tool! Thanks for your hard work.my blog is straining under load >_", "gh_updated_time": "", "gh_accessed_time": "", "hn_accessed_time": ""}, {"name": "ehmicky/wild-wild-path", "link": "https://github.com/ehmicky/wild-wild-path", "tags": ["nodejs", "javascript", "map", "json", "library", "algorithm", "typescript", "parsing", "functional-programming", "regex", "regexp", "filter", "glob", "regular-expression", "path", "recursion", "data-structures", "wildcard", "globbing", "globstar"], "stars": 598, "description": "\ud83e\udd20 Object property paths with wildcards and regexps \ud83c\udf35", "lang": "JavaScript", "repo_lang": "", "readme": " \n\n[![Node](https://img.shields.io/badge/-Node.js-808080?logo=node.js&colorA=404040&logoColor=66cc33)](https://www.npmjs.com/package/wild-wild-path)\n[![Browsers](https://img.shields.io/badge/-Browsers-808080?logo=firefox&colorA=404040)](https://unpkg.com/wild-wild-path?module)\n[![TypeScript](https://img.shields.io/badge/-Typed-808080?logo=typescript&colorA=404040&logoColor=0096ff)](/src/main.d.ts)\n[![Codecov](https://img.shields.io/badge/-Tested%20100%25-808080?logo=codecov&colorA=404040)](https://codecov.io/gh/ehmicky/wild-wild-path)\n[![Minified size](https://img.shields.io/bundlephobia/minzip/wild-wild-path?label&colorA=404040&colorB=808080&logo=webpack)](https://bundlephobia.com/package/wild-wild-path)\n[![Mastodon](https://img.shields.io/badge/-Mastodon-808080.svg?logo=mastodon&colorA=404040&logoColor=9590F9)](https://fosstodon.org/@ehmicky)\n[![Medium](https://img.shields.io/badge/-Medium-808080.svg?logo=medium&colorA=404040)](https://medium.com/@ehmicky)\n\n\ud83e\udd20 Object property paths with wildcards and regexps. 
\ud83c\udf35\n\nGet/set object properties using:\n\n- \u26cf\ufe0f [Dot-delimited paths](#%EF%B8%8F-deep-properties): `foo.bar.0.baz`\n- \u2b50 [Wildcards](#-wildcards): `foo.*`, `**.bar`\n- \ud83d\uddfa\ufe0f [Regexps](#%EF%B8%8F-regexps): `foo./ba?/`\n- \ud83c\udfdc\ufe0f [Slices](#%EF%B8%8F-array-slices): `foo.0:2`\n- \ud83d\ude82 [Unions](#-unions): `foo bar baz`\n\n# Hire me\n\nPlease\n[reach out](https://www.linkedin.com/feed/update/urn:li:activity:7018596298127781890/)\nif you're looking for a Node.js API or CLI engineer (10 years of experience).\nMost recently I have been [Netlify Build](https://github.com/netlify/build)'s\nand [Netlify Plugins](https://www.netlify.com/products/build/plugins/)'\ntechnical lead for 2.5 years. I am available for full-time remote positions in\neither US or EU time zones.\n\n# Install\n\n```bash\nnpm install wild-wild-path\n```\n\nThis package works in both Node.js >=14.18.0 and\n[browsers](https://raw.githubusercontent.com/ehmicky/dev-tasks/main/src/browserslist).\n\nThis is an ES module. It must be loaded using\n[an `import` or `import()` statement](https://gist.github.com/sindresorhus/a39789f98801d908bbc7ff3ecc99d99c),\nnot `require()`. If TypeScript is used, it must be configured to\n[output ES modules](https://www.typescriptlang.org/docs/handbook/esm-node.html),\nnot CommonJS.\n\n# API\n\n## Methods\n\n### get(target, query, options?)\n\n`target`: [`Target`](#target)\\\n`query`: [`Query`](#queries)\\\n`options`: [`Options?`](#options)\\\n_Return value_: `any | undefined`\n\nReturn the first property matching the `query`.\n\n```js\nconst target = { settings: { colors: ['red', 'blue'] } }\n\nget(target, 'settings.colors.0') // 'red'\nget(target, ['settings', 'colors', 0]) // 'red'\n```\n\n### has(target, query, options?)\n\n`target`: [`Target`](#target)\\\n`query`: [`Query`](#queries)\\\n`options`: [`Options?`](#options)\\\n_Return value_: `boolean`\n\nReturn whether the `query` matches any property.\n\n```js\nconst target = { settings: { lastName: undefined, colors: ['red', 'blue'] } }\n\nhas(target, 'settings.firstName') // false\nhas(target, ['settings', 'firstName']) // false\nhas(target, 'settings.lastName') // true\n```\n\n### list(target, query, options?)\n\n`target`: [`Target`](#target)\\\n`query`: [`Query`](#queries)\\\n`options`: [`Options?`](#options)\\\n_Return value_: `any[]`\n\nReturn all properties matching the `query`, as an array.\n\n\n\n```js\nconst target = {\n userOne: { firstName: 'John', lastName: 'Doe', age: 72 },\n userTwo: { firstName: 'Alice', colors: ['red', 'blue', 'yellow'] },\n}\n\nlist(target, 'userOne.firstName userTwo.colors.0') // ['John', 'red']\nlist(target, [\n ['userOne', 'firstName'],\n ['userTwo', 'colors', 0],\n]) // ['John', 'red']\n\nlist(target, 'userOne./Name/') // ['John', 'Doe']\nlist(target, ['userOne', /Name/]) // ['John', 'Doe']\n\nlist(target, 'userTwo.colors.*') // ['red', 'blue', 'yellow']\nlist(target, 'userTwo.colors.0:2') // ['red', 'blue']\nlist(target, '**.firstName') // ['John', 'Alice']\nlist(target, 'userOne.*', { entries: true })\n// [\n// { value: 'John', path: ['userOne', 'firstName'], missing: false },\n// { value: 'Doe', path: ['userOne', 'lastName'], missing: false },\n// { value: 72, path: ['userOne', 'age'], missing: false },\n// ]\n```\n\n### iterate(target, query, options?)\n\n`target`: [`Target`](#target)\\\n`query`: [`Query`](#queries)\\\n`options`: [`Options?`](#options)\\\n_Return value_: 
[`Iterable`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Iteration_protocols#examples_using_the_iteration_protocols)\n\nReturn all properties matching the `query`, as an\n[iterable](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Iteration_protocols#examples_using_the_iteration_protocols).\nThis is slower than [`list()`](#listtarget-query-options) but uses less memory.\n\n\n\n```js\nconst target = { settings: { colors: ['red', 'blue'] } }\n\nfor (const color of iterate(target, 'settings.colors.*')) {\n console.log(color) // 'red', 'blue'\n}\n```\n\n### set(target, query, value, options?)\n\n`target`: [`Target`](#target)\\\n`query`: [`Query`](#queries)\\\n`value`: `any`\\\n`options`: [`Options?`](#options)\\\n_Return value_: `Target`\n\nSets all properties matching the `query`. The return value is a deep clone\nunless the [`mutate`](#mutate) option is `true`.\n\n```js\nconst target = { colors: ['red', 'blue'] }\n\nset(target, 'colors.0', 'yellow') // ['yellow', 'blue']\nset(target, ['colors', 0], 'yellow') // ['yellow', 'blue']\nset(target, 'colors.-1', 'yellow') // ['red', 'yellow']\nset(target, 'colors.-0', 'yellow') // ['red', 'blue', 'yellow']\nset(target, 'colors.*', 'yellow') // ['yellow', 'yellow']\nset({}, 'user.0.color', 'red') // { user: [{ color: 'red' }] }\nset({}, 'user.0.color', 'red', { missing: false }) // {}\n```\n\n### remove(target, query, options?)\n\n`target`: [`Target`](#target)\\\n`query`: [`Query`](#queries)\\\n`options`: [`Options?`](#options)\\\n_Return value_: `Target`\n\nDelete all properties matching the `query`. The return value is a deep clone\nunless the [`mutate`](#mutate) option is `true`.\n\n\n\n```js\nconst target = { user: { firstName: 'John', lastName: 'Doe', age: 72 } }\n\nremove(target, 'user.lastName') // { user: { firstName: 'John', age: 72 } }\nremove(target, 'user./Name/') // { user: { age: 72 } }\nremove(target, ['user', /Name/]) // { user: { age: 72 } }\n```\n\n## Functional utilities\n\n[`wild-wild-utils`](https://github.com/ehmicky/wild-wild-utils) is a separate\nlibrary which provides with additional, higher-level methods:\n[`map()`](https://github.com/ehmicky/wild-wild-utils#maptarget-query-mapfunction-options),\n[`merge()`](https://github.com/ehmicky/wild-wild-utils#mergetarget-query-value-options),\n[`push()`](https://github.com/ehmicky/wild-wild-utils#pushtarget-query-values-options),\n[`unshift()`](https://github.com/ehmicky/wild-wild-utils#unshifttarget-query-values-options),\n[`find()`](https://github.com/ehmicky/wild-wild-utils#findtarget-query-testfunction-options),\n[`pick()`](https://github.com/ehmicky/wild-wild-utils#picktarget-query-options),\n[`include()`](https://github.com/ehmicky/wild-wild-utils#includetarget-query-testfunction-options),\n[`exclude()`](https://github.com/ehmicky/wild-wild-utils#excludetarget-query-testfunction-options),\n[`flatten()`](https://github.com/ehmicky/wild-wild-utils#flattentarget-options).\n\n## Target\n\nThe target value must be an object or an array.\n\n## Queries\n\nThere are two equivalent formats for queries: strings and arrays.\n\n- Query [strings](#query-strings) are friendlier to CLI usage, more expressive,\n and easier to serialize.\n- Query [arrays](#query-arrays) are friendlier to programmatic usage, and\n faster. 
Also, they do not require escaping, so they should be used when the\n input is dynamic or user-provided to prevent injection attacks.\n\n### Query strings\n\n#### \u26cf\ufe0f Deep properties\n\n```bash\n# Deep properties of objects or arrays.\n# Dots are used for array indices, not brackets.\n# Symbol properties are always ignored.\nuser.colors.0\n```\n\n#### \ud83d\ude82 Unions\n\n```bash\n# Unions (\"or\") of queries are space-delimited.\n# The string must not be empty.\ncolors name age\n```\n\n#### \u2b50 Wildcards\n\n```bash\n# Shallow wildcards target all properties/items of a single object/array\nuser.*\n\n# Deep wildcards target all properties/items of 0, 1 or many objects/arrays\nuser.**\n**.colors\n```\n\n#### \ud83d\uddfa\ufe0f Regexps\n\n```bash\n# Regexps match property names\nuser./name/\n\n# Flags can be used, e.g. to make it case-insensitive\nuser./name/i\n\n# ^ $ must be used to match from the beginning or until the end\nuser./^name$/i\n```\n\n#### \ud83c\udf35 Arrays indices\n\n```bash\n# Array indices are integers\nuser.colors.0\n\n# Array indices can be negative.\n# -1 is the last item.\n# -0 is the item after it, which can be used to append.\nuser.colors.-1\n```\n\n#### \ud83c\udfdc\ufe0f Array slices\n\n```bash\n# Array slices. Goes from the start (included) to the end index (excluded).\nuser.colors.0:2\n\n# The start index defaults to 0, i.e. the beginning\nuser.colors.:2\n\n# The end index defaults to -0, i.e. the end\nuser.colors.0:\nuser.colors.:\n```\n\n#### \ud83e\udea8 Escaping\n\n```bash\n# Dots, spaces and backslashes in property names must be escaped\nname\\\\ with\\\\ spaces\nname\\\\.with\\\\.dots\nname\\\\\\\\with\\\\\\\\backslashes\n\n# Ambiguous property names must be escaped with a backslash at the beginning.\n# This includes properties that:\n# - Are integers but are not array elements\n# - Have multiple slashes and start with one\nname.\\\\0\nname.\\\\/not_a_regexp/\n```\n\n#### \ud83c\udfe8 Root and empty strings\n\n```bash\n# A leading dot can optionally be used. It is ignored.\nuser.colors\n.user.colors\n\n# Root value\n.\n\n# Empty string properties\nuser..colors\n```\n\n### Query arrays\n\n#### \u26cf\ufe0f Deep properties\n\n\n```es6\n// Deep properties of objects or arrays.\n// Symbol properties are always ignored.\n['user', 'colors', 0]\n```\n\n#### \ud83d\ude82 Unions\n\n\n```es6\n// Unions (\"or\") of queries are arrays of arrays.\n// There must be at least one item.\n[['colors'], ['name'], ['age']]\n```\n\n#### \u2b50 Wildcards\n\n\n```es6\n// Shallow wildcards target all properties/items of a single object/array\n['user', { type: 'any' }]\n\n// Deep wildcards target all properties/items of 0, 1 or many objects/arrays\n['user', { type: 'anyDeep' }]\n[{ type: 'anyDeep' }, 'colors']\n```\n\n#### \ud83e\udd20 Regexps\n\n\n```es6\n// Regexps match property names\n['user', /name/]\n\n// Flags can be used, e.g. to make it case-insensitive\n['user', /name/i]\n\n// ^ $ must be used to match from the beginning or until the end\n['user', /^name$/i]\n```\n\n#### \ud83c\udf35 Arrays indices\n\n\n```es6\n// Array indices are integers, not strings\n['user', 'colors', 0]\n\n// Array indices can be negative.\n// -1 is the last item.\n// -0 is the item after it, which can be used to append.\n['user', 'colors', -1]\n```\n\n#### \ud83c\udfdc\ufe0f Array slices\n\n\n```es6\n// Array slices. Goes from the start (included) to the end index (excluded).\n['user', 'colors', { type: 'slice', from: 0, to: 2 }]\n\n// The start index defaults to 0, i.e. 
the beginning\n['user', 'colors', { type: 'slice', to: 2 }]\n\n// The end index defaults to -0, i.e. the end\n['user', 'colors', { type: 'slice', from: 0 }]\n['user', 'colors', { type: 'slice' }]\n```\n\n#### \ud83e\udea8 Escaping\n\n\n```es6\n// Escaping is not necessary with query arrays\n['name with spaces']\n['name.with.dots']\n['name\\\\with\\\\backslashes']\n['name', '0']\n['name', '/not_a_regexp/']\n```\n\n#### \ud83c\udfe8 Root and empty strings\n\n\n```es6\n// Root value\n[]\n\n// Empty string properties\n['user', '', 'colors']\n```\n\n### Paths\n\nA \"path\" is any [query](#queries) using only\n[property names](#%EF%B8%8F-deep-properties) and positive\n[array indices](#-arrays-indices). This excludes\n[negative indices](#-arrays-indices), [slices](#%EF%B8%8F-array-slices),\n[wildcards](#-wildcards), [regexps](#%EF%B8%8F-regexps) and [unions](#-unions).\n\nPaths are returned by the [`entries`](#entries) option.\n\n```bash\n# Path string\nuser.colors.0\n```\n\n\n```es6\n// Path array\n['user', 'colors', 0]\n```\n\n### Conversions and comparisons\n\n[`wild-wild-parser`](https://github.com/ehmicky/wild-wild-parser) can be used to\nconvert between both formats, or to compare queries.\n\n### Undefined values\n\nObject properties with a defined key but an `undefined` value are not ignored.\nHowever, object properties without any defined key are ignored. The\n[`has()`](#hastarget-query-options) method, [`missing`](#missing) option and\n[`entries`](#entries) option can be used to distinguish those.\n\n```js\nconst target = { name: undefined }\n\nhas(target, 'name') // true\nhas(target, 'colors') // false\n\nget(target, 'name') // undefined\nget(target, 'colors') // undefined\nget(target, 'name', { entries: true, missing: true })\n// { value: undefined, path: ['name'], missing: false }\nget(target, 'colors', { entries: true, missing: true })\n// { value: undefined, path: ['colors'], missing: true }\n\nlist(target, '*') // [undefined]\nlist(target, '*', { entries: true })\n// [{ value: undefined, path: ['name'], missing: false }]\n```\n\n## Options\n\nOptions are optional plain objects.\n\n### mutate\n\n_Methods_: [`set()`](#settarget-query-value-options),\n[`remove()`](#removetarget-query-options)\\\n_Type_: `boolean`\\\n_Default_: `false`\n\nBy default, the [target](#target) is deeply cloned.\\\nWhen `true`, it is directly mutated instead, which is faster but has side effects.\n\n```js\nconst target = {}\nconsole.log(set(target, 'name', 'Alice')) // { name: 'Alice' }\nconsole.log(target) // {}\nconsole.log(set(target, 'name', 'Alice', { mutate: true })) // { name: 'Alice' }\nconsole.log(target) // { name: 'Alice' }\n```\n\n### entries\n\n_Methods_: [`get()`](#gettarget-query-options),\n[`list()`](#listtarget-query-options),\n[`iterate()`](#iteratetarget-query-options)\\\n_Type_: `boolean`\\\n_Default_: `false`\n\nBy default, properties' values are returned.\\\nWhen `true`, objects with the following shape are returned instead:\n\n- `value` `any`: property's value\n- `path` [`Path`](#paths): property's full path\n- `missing` `boolean`: whether the property is [missing](#missing) from the\n [target](#target)\n\n```js\nconst target = { firstName: 'Alice', lastName: 'Smith' }\nlist(target, '*') // ['Alice', 'Smith']\nlist(target, '*', { entries: true })\n// [\n// { value: 'Alice', path: ['firstName'], missing: false },\n// { value: 'Smith', path: ['lastName'], missing: false },\n// ]\n```\n\n### missing\n\n_Methods_: all except [`has()`](#hastarget-query-options) 
and\n[`remove()`](#removetarget-query-options)\\\n_Type_: `boolean`\\\n_Default_: `false` with `list|iterate()`, `true` with `set()`\n\nWhen `false`, properties [not defined in the target](#undefined-values) are\nignored.\n\n```js\nconst target = {}\n\nset(target, 'name', 'Alice') // { name: 'Alice' }\nset(target, 'name', 'Alice', { missing: false }) // {}\n\nlist(target, 'name') // []\nlist(target, 'name', { missing: true, entries: true })\n// [{ value: undefined, path: ['name'], missing: true }]\n```\n\n### sort\n\n_Methods_: [`get()`](#gettarget-query-options),\n[`list()`](#listtarget-query-options),\n[`iterate()`](#iteratetarget-query-options)\\\n_Type_: `boolean`\\\n_Default_: `false`\n\nWhen returning sibling object properties, sort them by the lexigographic order\nof their names (not values).\n\n```js\nconst target = { lastName: 'Doe', firstName: 'John' }\nlist(target, '*') // ['Doe', 'John']\nlist(target, '*', { sort: true }) // ['John', 'Doe']\n```\n\n### childFirst\n\n_Methods_: [`get()`](#gettarget-query-options),\n[`list()`](#listtarget-query-options),\n[`iterate()`](#iteratetarget-query-options)\\\n_Type_: `boolean`\\\n_Default_: `false`\n\nWhen using [unions](#-unions) or [deep wildcards](#-wildcards), a query might\nmatch both a property and some of its children.\n\nThis option decides whether the returned properties should be sorted from\nchildren to parents, or the reverse.\n\n```js\nconst target = { user: { name: 'Alice' } }\nlist(target, 'user.**') // [{ name: 'Alice' }, 'Alice']\nlist(target, 'user.**', { childFirst: true }) // ['Alice', { name: 'Alice' }]\n```\n\n### leaves\n\n_Methods_: all except [`has()`](#hastarget-query-options)\\\n_Type_: `boolean`\\\n_Default_: `false`\n\nWhen using [unions](#-unions) or [deep wildcards](#-wildcards), a query might\nmatch both a property and some of its children.\n\nWhen `true`, only leaves are matched. In other words, a matching property is\nignored if one of its children also matches.\n\n```js\nconst target = { user: { name: 'Alice' } }\nlist(target, 'user.**') // [{ name: 'Alice' }, 'Alice']\nlist(target, 'user.**', { leaves: true }) // ['Alice']\n```\n\n### roots\n\n_Methods_: [`get()`](#gettarget-query-options),\n[`list()`](#listtarget-query-options),\n[`iterate()`](#iteratetarget-query-options)\\\n_Type_: `boolean`\\\n_Default_: `false`\n\nWhen using [unions](#-unions) or [deep wildcards](#-wildcards), a query might\nmatch both a property and some of its children.\n\nWhen `true`, only roots are matched. In other words, a matching property is\nignored if one of its parents also matches.\n\n```js\nconst target = { user: { name: 'Alice' } }\nlist(target, 'user.**') // [{ name: 'Alice' }, 'Alice']\nlist(target, 'user.**', { roots: true }) // [{ name: 'Alice' }]\n```\n\n### shallowArrays\n\n_Methods_: all\\\n_Type_: `boolean`\\\n_Default_: `false`\n\nIf `true`, [wildcards](#-wildcards) do not recurse on arrays. 
Array items can\nstill be matched by using [indices](#-arrays-indices) or\n[slices](#%EF%B8%8F-array-slices).\n\n```js\nconst target = [{ name: 'Alice' }, { name: 'Bob' }]\nlist(target, '**')\n// [\n// [{ name: 'Alice' }, { name: 'Bob' }],\n// { name: 'Alice' },\n// 'Alice',\n// { name: 'Bob' },\n// 'Bob',\n// ]\nlist(target, '**', { shallowArrays: true })\n// [\n// [{ name: 'Alice' }, { name: 'Bob' }],\n// ]\n```\n\n### classes\n\n_Methods_: all\\\n_Type_: `boolean`\\\n_Default_: `false`\n\nUnless `true`, [wildcards](#-wildcards) and [regexps](#%EF%B8%8F-regexps) ignore\nproperties of objects that are not plain objects (like class instances, errors\nor functions). Those can still be matched by using their\n[property name](#%EF%B8%8F-deep-properties).\n\n```js\nconst target = { user: new User({ name: 'Alice' }) }\nlist(target, 'user.*') // []\nlist(target, 'user.*', { classes: true }) // ['Alice']\n```\n\n### inherited\n\n_Methods_: all\\\n_Type_: `boolean`\\\n_Default_: `false`\n\nBy default, [wildcards](#-wildcards) and [regexps](#%EF%B8%8F-regexps) ignore\nproperties that are either\n[inherited](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Inheritance_and_the_prototype_chain)\nor\n[not enumerable](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Enumerability_and_ownership_of_properties).\nThose can still be matched by using their\n[property name](#%EF%B8%8F-deep-properties).\n\nWhen `true`, inherited properties are not ignored, but not enumerable ones still\nare.\n\n# Related projects\n\n- [`wild-wild-utils`](https://github.com/ehmicky/wild-wild-utils): functional\n utilities using `wild-wild-path`'s object property paths\n- [`wild-wild-parser`](https://github.com/ehmicky/wild-wild-parser): parser for\n `wild-wild-path`'s object property paths\n\n# Support\n\nFor any question, _don't hesitate_ to [submit an issue on GitHub](../../issues).\n\nEveryone is welcome regardless of personal background. We enforce a\n[Code of conduct](CODE_OF_CONDUCT.md) in order to promote a positive and\ninclusive environment.\n\n# Contributing\n\nThis project was made with \u2764\ufe0f. The simplest way to give back is by starring and\nsharing it online.\n\nIf the documentation is unclear or has a typo, please click on the page's `Edit`\nbutton (pencil icon) and suggest a correction.\n\nIf you would like to help us fix a bug or add a new feature, please check our\n[guidelines](CONTRIBUTING.md). 
Pull requests are welcome!\n\n\n\n\n\n\n\n\n\n\n\n\n", "readme_type": "markdown", "hn_comments": "", "gh_updated_time": "", "gh_accessed_time": "", "hn_accessed_time": ""}, {"name": "Hibear/verify", "link": "https://github.com/Hibear/verify", "tags": [], "stars": 598, "description": "\u5e38\u89c4\u9a8c\u8bc1\u7801\u3001\u6ed1\u52a8\u9a8c\u8bc1\u7801\u3001\u62fc\u56fe\u9a8c\u8bc1\u7801\u3001\u9009\u5b57\u9a8c\u8bc1\u7801\uff0c\u7eaf\u524d\u7aef\u9a8c\u8bc1\u7801\u3002", "lang": "JavaScript", "repo_lang": "", "readme": "# verify\nRegular verification code, sliding verification code, puzzle verification code, word selection verification code, pure front-end verification code.\n\nofficial website ", "readme_type": "markdown", "hn_comments": "", "gh_updated_time": "", "gh_accessed_time": "", "hn_accessed_time": ""}, {"name": "bitpay/bitcore-lib", "link": "https://github.com/bitpay/bitcore-lib", "tags": [], "stars": 597, "description": "A pure and powerful JavaScript Bitcoin library", "lang": "JavaScript", "repo_lang": "", "readme": "Bitcore Library\n=======\nTHIS REPO HAVE BEEN MOVED TO BITCORE's MONO REPO. Check: \n\nhttps://github.com/bitpay/bitcore/tree/v8.0.0/packages/bitcore-lib\n", "readme_type": "markdown", "hn_comments": "", "gh_updated_time": "", "gh_accessed_time": "", "hn_accessed_time": ""}, {"name": "fireyy/react-antd-admin", "link": "https://github.com/fireyy/react-antd-admin", "tags": ["react", "antd", "admin-dashboard", "admin-ui", "admin-theme"], "stars": 597, "description": "This Project Is Deprecated. Use [Ant Design Pro](https://pro.ant.design/) instead.", "lang": "JavaScript", "repo_lang": "", "readme": ">## This Project Is Deprecated. Use [Ant Design Pro](https://pro.ant.design) instead.\n\n>[Ant Design Pro](https://pro.ant.design) is a production-ready solution for admin interfaces. Built on the design principles developed by Ant Design, this project introduces higher level components; we have developed templates, components, and a corresponding design kit to improve the user and development experience for admin interfaces.\n\n## React Ant.Design Admin UI\n\n\n \n \n \n Live Demo\n \n
\n\n## Features\n\n- [React](https://facebook.github.io/react/)\n- [Redux](https://github.com/reactjs/redux)\n- [Ant.Design](http://ant.design/)\n- [Babel](https://babeljs.io/)\n- [webpack](https://webpack.github.io/)\n- [mocha](https://mochajs.org/)\n- [enzyme](https://github.com/airbnb/enzyme)\n\n## Getting Started\n\nJust clone the repo and install the necessary node modules:\n\n```shell\n$ git clone https://github.com/fireyy/react-antd-admin\n$ cd react-antd-admin\n$ npm install\n```\n\n## Run Dev\n\n```shell\n$ npm run dev\n```\n\n## Run test spec\n\n```shell\n$ npm run test\n```\n\n## Build\n\n```shell\n$ npm run build\n```\n\n## Changelog\n\n### 0.2.0\n\n* Update React to 15.6.x\n* Update webpack to 2.x\n\n### 0.1.2\n\n* Update dependencies to their latest versions\n* Add a page2 demo\n", "readme_type": "markdown", "hn_comments": "", "gh_updated_time": "", "gh_accessed_time": "", "hn_accessed_time": ""}, {"name": "angular-ui/ui-select2", "link": "https://github.com/angular-ui/ui-select2", "tags": [], "stars": 597, "description": "AngularJS wrapper for select2 (deprecated, use angular-ui/ui-select)", "lang": "JavaScript", "repo_lang": "", "readme": "ui-select2 (deprecated) [![Build Status](https://travis-ci.org/angular-ui/ui-select2.png)](https://travis-ci.org/angular-ui/ui-select2)\n========================\n\n# Announcement\n\n\nThis **directive is now obsolete**. A newer, more active, and 100% Angular alternative is available at https://github.com/angular-ui/ui-select.\n\nAs development on ui-select2 has slowed down, bugs are unlikely to be fixed, so the new alternative should be adopted as soon as possible.\n\n# Description\n\nThis directive allows you to enhance your select elements with behaviour from the [select2](http://ivaynberg.github.io/select2/) library.\n\n# Requirements\n\n- [AngularJS](http://angularjs.org/)\n- [jQuery](http://jquery.com/)\n- [Select2](http://ivaynberg.github.io/select2/)\n\n## Setup\n\n1. Install **Karma**, **Grunt** and **Bower**\n `$ npm install -g karma grunt-cli bower`\n2. Install development dependencies\n `$ npm install`\n3. Install components\n `$ bower install`\n4. ???\n5. Profit!\n\n## Testing\n\nWe use [Grunt](http://gruntjs.com/) to check for JavaScript syntax errors and execute all unit tests. To run Grunt, simply execute:\n\n`$ grunt`\n\nThis will lint and test the code, then exit. To have Grunt stay open and automatically lint and test your files whenever you make a code change, use:\n\n`$ grunt karma:server watch`\n\nThis will start a Karma server in the background and run unit tests in Firefox and PhantomJS whenever the source code or spec file is saved.\n\n# Usage\n\nWe use [bower](https://github.com/bower/bower) for dependency management. Install AngularUI Select2 into your project by running the command\n\n`$ bower install angular-ui-select2`\n\nIf you use a `bower.json` file in your project, you can have Bower save ui-select2 as a dependency by passing the `--save` or `--save-dev` flag with the above command.\n\nThis will copy the ui-select2 files into your `bower_components` folder, along with its dependencies.\n\n
Load the script files in your application:\n```html\n \n\n\n\n\n```\n\n(Note that `jquery` must be loaded before `angular` so that it doesn't use `jqLite` internally)\n\n\nAdd the select2 module as a dependency to your application module:\n\n```javascript\nvar myAppModule = angular.module('MyApp', ['ui.select2']);\n```\n\nApply the directive to your form elements:\n\n```html\n\n \n First \n Second \n Third \n \n```\n\n## Options\n\nAll the select2 options can be passed through the directive. You can read more about the supported list of options and what they do on the [Select2 Documentation Page](http://ivaynberg.github.com/select2/)\n\n```javascript\nmyAppModule.controller('MyController', function($scope) {\n $scope.select2Options = {\n allowClear:true\n };\n});\n```\n\n```html\n\n First \n Second \n Third \n \n```\n\nSome times it may make sense to specify the options in the template file.\n\n```html\n\n First \n Second \n Third \n \n```\n\nTo define global defaults, you can configure the `uiSelect2Config` injectable:\n\n```javascript\nmyAppModule.run(['uiSelect2Config', function(uiSelect2Config) {\n\tuiSelect2Config.placeholder = \"Placeholder text\";\n}]);\n```\n\n## Working with ng-model\n\nThe ui-select2 directive plays nicely with ng-model and validation directives such as ng-required.\n\nIf you add the ng-model directive to same the element as ui-select2 then the picked option is automatically synchronized with the model value.\n\n## Working with dynamic options\n`ui-select2` is incompatible with ``. For the best results use `` instead.\n```html\n\n \n {{number.text}} \n \n```\n\n## Working with placeholder text\nIn order to properly support the Select2 placeholder, create an empty ` ` tag at the top of the `` and either set a `data-placeholder` on the select element or pass a `placeholder` option to Select2.\n```html\n\n \n First \n Second \n Third \n \n```\n\n## ng-required directive\n\nIf you apply the required directive to element then the form element is invalid until an option is selected.\n\nNote: Remember that the ng-required directive must be explicitly set, i.e. to \"true\". 
This is especially true on divs:\n\n```html\n\n \n First \n Second \n Third \n \n```\n\n## Using simple tagging mode\n\nWhen AngularJS View-Model tags are stored as a list of strings, setting\nthe ui-select2 specific option `simple_tags` will allow to keep the model\nas a list of strings, and not convert it into a list of Select2 tag objects.\n\n```html\n \n```\n\n```javascript\nmyAppModule.controller('MyController', function($scope) {\n $scope.list_of_string = ['tag1', 'tag2']\n $scope.select2Options = {\n 'multiple': true,\n 'simple_tags': true,\n 'tags': ['tag1', 'tag2', 'tag3', 'tag4'] // Can be empty list.\n };\n});\n```\n", "readme_type": "markdown", "hn_comments": "", "gh_updated_time": "", "gh_accessed_time": "", "hn_accessed_time": ""}, {"name": "ConardLi/tpanorama", "link": "https://github.com/ConardLi/tpanorama", "tags": [], "stars": 597, "description": "\u4e00\u6b3e\u975e\u5e38\u597d\u7528\u7684\u5168\u666f\u751f\u6210\uff0c\u5168\u666f\u6807\u8bb0\u7f16\u8f91\u63d2\u4ef6\uff01", "lang": "JavaScript", "repo_lang": "", "readme": "\n![\u6807\u9898](./img/title.png)\n\n# tpanorama\n\n\u63d2\u4ef6\u5305\u62ec\u4e24\u90e8\u5206\uff1a\u5168\u666f\u5c55\u793a\u90e8\u5206\uff0c\u5168\u666f\u6807\u8bb0\u7f16\u8f91\u90e8\u5206\uff0c\u4e8c\u8005\u7ed3\u5408\u4f7f\u7528\u975e\u5e38\u65b9\u4fbf\uff01\n\n# \u672c\u5730\u6d4b\u8bd5\n\n```\nnpm install\nnpm run example\n```\n\n```\nhttp://localhost:3000/\n\nhttp://localhost:3000/page1.html\n```\n\n# \u4f7f\u7528\n\n## 1.\u76f4\u63a5\u5f15\u7528js\n\n1.\u5f15\u7528 [three.js](/origin/three.js)\n\n2.\u5f15\u7528 [tpanorama.js](/origin/tpanorama.js)\n\n```js\n \n \n```\n\n\u53c2\u8003[/examples/page1.html](/examples/page1.html)\n\n## 2.npm\n\n```\nnpm install tpanorama\n```\n\n```js\nvar { tpanorama,tpanoramaSetting} = require('tpanorama');\n```\n\n\u53c2\u8003[/examples/index.js](/examples/index.js)\n\n# 1.\u5168\u666f\u5c55\u793a\n\n![](./img/qj_15_qj.gif)\n\n## 1.1 \u53c2\u6570\u8bf4\u660e\n\n\u53c2\u6570\u540d\u79f0 | \u7c7b\u578b | \u8bf4\u660e \n:-: | :-: | :-: \ncontainer |string| \u5b58\u653e\u5168\u666f\u7684\u5bb9\u5668id \nurl | string | \u5b58\u653e\u5168\u666f\u56fe\u7247\u7684\u8def\u5f84 \nlables | array | {position:{lon:\u7ecf\u5ea6,lat:\u7eac\u5ea6},logoUrl:'logo\u8def\u5f84',text:'\u5185\u5bb9'}\nwidthSegments |num| \u6c34\u5e73\u5207\u6bb5\u6570 \nheightSegments |num| \u5782\u76f4\u5207\u6bb5\u6570\uff08\u503c\u5c0f\u7c97\u7cd9\u901f\u5ea6\u5feb\uff0c\u503c\u5927\u7cbe\u7ec6\u901f\u5ea6\u6162\uff09 \npRadius |num| \u5168\u666f\u7403\u7684\u534a\u5f84\uff0c\u5f71\u54cd\u89c6\u89c9\u6548\u679c\uff0c\u63a8\u8350\u4f7f\u7528\u9ed8\u8ba4\u503c \nminFocalLength |num| \u955c\u5934\u6700\u5c0f\u62c9\u8fd1\u8ddd\u79bb \nmaxFocalLength |num| \u955c\u5934\u6700\u5927\u62c9\u8fd1\u8ddd\u79bb \nspaire |'label'/'icon'| \u663e\u793a\u6807\u8bb0\u7684\u5185\u5bb9\uff0c\u6587\u5b57\u6216\u56fe\u6807\nonClick|func|\u70b9\u51fb\u6807\u8bb0\u7684\u56de\u8c03\u51fd\u6570\n\n## 1.2 \u516c\u7528\u65b9\u6cd5\n\n\n\u65b9\u6cd5\u540d\u79f0 | \u8bf4\u660e \n:-:| :-: \nconfig | \u7ed9\u5168\u666f\u5bf9\u8c61\u8bbe\u7f6e\u914d\u7f6e\u4fe1\u606f\ninit | \u521d\u59cb\u5316\u5168\u666f\u5bf9\u8c61 \nclean | \u6e05\u9664\u5168\u666f\u5bf9\u8c61 \n\n## 1.3 \u4f7f\u7528\n\n\u521d\u59cb\u5316\uff1a\uff08\u53c2\u6570\u4e0d\u8bbe\u7f6e\u5219\u91c7\u7528\u9ed8\u8ba4\u53c2\u6570\uff09\n\n```js\n var opt,tp;\n window.onload = function () {\n opt = {\n container:'panoramaConianer',//\u5bb9\u5668\n url:'img/p1.png',\n lables:[\n 
{position:{lon:180,lat:0},logoUrl:'',text:'\u6211\u662f\u4e00\u4e2a\u6807\u8bb0'}\n ],\n widthSegments: 60,//\u6c34\u5e73\u5207\u6bb5\u6570\n heightSegments: 40,//\u5782\u76f4\u5207\u6bb5\u6570\uff08\u503c\u5c0f\u7c97\u7cd9\u901f\u5ea6\u5feb\uff0c\u503c\u5927\u7cbe\u7ec6\u901f\u5ea6\u6162\uff09\n pRadius: 1000,//\u5168\u666f\u7403\u7684\u534a\u5f84\uff0c\u63a8\u8350\u4f7f\u7528\u9ed8\u8ba4\u503c\n minFocalLength: 6,//\u955c\u5934\u6700a\u5c0f\u62c9\u8fd1\u8ddd\u79bb\n maxFocalLength: 100,//\u955c\u5934\u6700\u5927\u62c9\u8fd1\u8ddd\u79bb\n showlable: 'show' // show,click\n }\n tp = new tpanorama(opt);\n tp.init();\n }\n```\n\n\u4fee\u6539\u67d0\u4e9b\u53c2\u6570\n\n```js\nopt.showlable = 'click';\nopt.lables = [{position:{lon:180,lat:0},logoUrl:'img/logo.png',text:'\u70b9\u51fb\u4e86\u8fd9\u4e2a\u6807\u8bb0'}];\nopt.url = 'img/p1.png';\ntp.clean();\ntp.config(opt);\ntp.init();\n```\n\n# 2.\u6dfb\u52a0\u5168\u666f\u6807\u8bb0\u5de5\u5177\n\n![](./img/qj_17_qj.gif)\n\n\u5728\u5c55\u793a\u5168\u666f\u7684\u65f6\u5019\uff0c\u4f60\u6216\u8bb8\u4f1a\u5bf9\u6807\u8bb0\u7684\u4f4d\u7f6e\u4ea7\u751f\u4e86\u7591\u95ee\uff0c\u5982\u4f55\u786e\u5b9a\u6807\u8bb0\u7684\u4f4d\u7f6e\uff1f\n\n\u6211\u4eec\u53ef\u4ee5\u4f7f\u7528\u7c7b\u4f3c\u7ecf\u7eac\u5ea6\u7684\u53c2\u6570\u6765\u8868\u8fbe\u5b83\uff0c\u6ce8\u610f\u8fd9\u91cc\u4f7f\u7528\u7684\u7ecf\u7eac\u5ea6\u5e76\u4e0d\u662f\u771f\u6b63\u7684\u7ecf\u7eac\u5ea6\uff0c\u662f\u6211\u4eec\u6839\u636e\u5730\u7403\u7684\u7ecf\u7eac\u5ea6\u6a21\u62df\u51fa\u6765\u7684\u4e00\u4e2a\u53c2\u6570\u3002\n\n\u4e0b\u9762\u8fd9\u4e2a\u5de5\u5177\u5c31\u662f\u7528\u4e8e\u83b7\u53d6\u6211\u4eec\u60f3\u6807\u8bb0\u4f4d\u7f6e\u7684 '\u7ecf\u7eac\u5ea6'\u7684\u4e00\u79cd\u65b9\u6cd5\uff0c\u6709\u4e86\u8fd9\u4e2a\u5de5\u5177\u5c31\u53ef\u4ee5\u5b8c\u7f8e\u7ed3\u5408\u4e0a\u9762\u7684\u5168\u666f\u5c55\u793a\u5de5\u5177\u6765\u4f7f\u7528\u4e86\u3002\n\n## 2.1 \u53c2\u6570\u8bf4\u660e\n\n\u53c2\u6570\u540d\u79f0 | \u7c7b\u578b | \u8bf4\u660e \n:-: | :-: | :-: \ncontainer |string| \u5b58\u653e\u5168\u666f\u8bbe\u7f6e\u7684\u5bb9\u5668id \nimgUrl | string | \u5168\u666f\u56fe\u8def\u5f84 \nwidth | string | \u6307\u5b9a\u5bbd\u5ea6\uff08\u8fd9\u91cc\u56fe\u7247\u5fc5\u987b\u4e25\u683c\u6309\u6bd4\u4f8b\u653e\u7f6e\uff09\uff0c\u9ad8\u5ea6\u81ea\u9002\u5e94 \nshowGrid | bool | \u662f\u5426\u663e\u793a\u5168\u666f\u56fe\u7684\u7f51\u683c\nshowPosition | bool | \u662f\u5426\u663e\u793a\u7ecf\u7eac\u5ea6\u4fe1\u606f\u6846 \nlableColor | string | \u6807\u8bb0\u5728\u56fe\u4e0a\u7684\u989c\u8272 \ngridColor | string | \u7ed8\u5236\u683c\u7f51\u7684\u989c\u8272 \nlables | array | \u4ee5\u524d\u6807\u8bb0\u8fc7\u7684\u6807\u8bb0 {lon:114,lat:38,text:'\u6807\u8bb0\u4e00'}\naddLable | bool | \u662f\u5426\u5f00\u542f\u53cc\u51fb\u6dfb\u52a0\u6807\u8bb0(\u5fc5\u987b\u5f00\u542f\u7ecf\u7eac\u5ea6\u63d0\u793a)\ngetLable | bool | \u662f\u5426\u5f00\u542f\u53f3\u952e\u67e5\u8be2\u6807\u8bb0 (\u5fc5\u987b\u5f00\u542f\u7ecf\u7eac\u5ea6\u63d0\u793a)\ndeleteLbale | bool | \u5f00\u542f\u9ed8\u8ba4\u4e2d\u952e\u5220\u9664 \uff08\u5fc5\u987b\u5f00\u542f\u7ecf\u7eac\u5ea6\u63d0\u793a\uff09\n\n## 2.2 \u516c\u5171\u65b9\u6cd5\n\n\u65b9\u6cd5\u540d\u79f0 | \u8bf4\u660e \n:-:| :-: \nconfig | \u7ed9\u5168\u666f\u5bf9\u8c61\u91cd\u65b0\u8bbe\u7f6e\u914d\u7f6e\u4fe1\u606f\ninit | \u521d\u59cb\u5316\u5168\u666f\u8bbe\u7f6e\u5bf9\u8c61 \ngetAllLables | \u83b7\u53d6\u6240\u6709\u5df2\u7ecf\u6dfb\u52a0\u7684\u6807\u8bb0 \naddLable | \u7528\u4e8e\u624b\u52a8\u8bbe\u7f6e\u6dfb\u52a0\u6807\u8bb0\ngetLable | 
\u7528\u4e8e\u624b\u52a8\u8bbe\u7f6e\u83b7\u53d6\u6807\u8bb0\ndelete | \u7528\u4e8e\u624b\u52a8\u5220\u9664\u6807\u8bb0\nlisten | \u5bf9\u5168\u666f\u5bf9\u8c61\u76d1\u542c\u4e8b\u4ef6\n\n## 2.3 \u4f7f\u7528\n\n### 2.3.1 \u9ed8\u8ba4\u53c2\u6570\u4f7f\u7528\n\n\u4f7f\u7528\u9ed8\u8ba4\u53c2\u6570\uff0c\u5bf9\u6807\u8bb0\u7684\u589e\u5220\u67e5\u6539\u5df2\u5c01\u88c5\u597d\uff0c\u6240\u6709\u6807\u8bb0\u8bbe\u7f6e\u5b8c\u6210\u65f6\u53ef\u4f7f\u7528getAll\u65b9\u6cd5\u4e0e\u6570\u636e\u5e93\u8fdb\u884c\u4ea4\u4e92\n\n\u521d\u59cb\u5316\uff08\u53c2\u6570\u4e0d\u8bbe\u7f6e\u5219\u91c7\u7528\u9ed8\u8ba4\u53c2\u6570\uff09\n\n```js\n var opt,s;\n window.onload = function () {\n opt = {\n container: 'set',//setting\u5bb9\u5668\n imgUrl: 'img/p3.png',\n width: '1000px',//\u6307\u5b9a\u5bbd\u5ea6\uff0c\u9ad8\u5ea6\u81ea\u9002\u5e94\n showGrid: true,//\u662f\u5426\u663e\u793a\u683c\u7f51\n showPosition: true,//\u662f\u5426\u663e\u793a\u7ecf\u7eac\u5ea6\u63d0\u793a\n lableColor: '#9400D3',//\u6807\u8bb0\u989c\u8272\n gridColor: '#48D1CC',//\u683c\u7f51\u989c\u8272\n lables: [\n {lon:-72.00,lat:9.00,text:'\u84dd\u7a97\u6237'},{lon:114.12,lat:69.48,text:'\u4e00\u7247\u4e91\u5f69'},{lon:132.48,lat:-12.24,text:'\u5927\u6d77'}\n ],//\u6807\u8bb0 {lon:114,lat:38,text:'\u6807\u8bb0\u4e00'}\n addLable: true,//\u5f00\u542f\u540e\u53cc\u51fb\u6dfb\u52a0\u6807\u8bb0 (\u5fc5\u987b\u5f00\u542f\u7ecf\u7eac\u5ea6\u63d0\u793a)\n getLable: true,//\u5f00\u542f\u540e\u53f3\u952e\u67e5\u8be2\u6807\u8bb0 (\u5fc5\u987b\u5f00\u542f\u7ecf\u7eac\u5ea6\u63d0\u793a)\n deleteLbale:true//\u5f00\u542f\u540e\u4e2d\u952e\u5220\u9664(\u5fc5\u987b\u5f00\u542f\u7ecf\u7eac\u5ea6\u63d0\u793a)\n };\n s = new tpanoramaSetting(opt);\n s.init();\n }\n```\n\n\u53c2\u6570\u5207\u6362\n\n```js\n function changeImg(name) {\n if (name == \"p1\"){\n opt.lables = [{lon:178.56,lat:-15.84,text:'\u795e\u50cf'}]\n }\n if (name == \"p2\"){\n opt.lables = [{lon:-80.64,lat:-16.92,text:'\u84dd\u8272'},{lon:46.80,lat:10.44,text:'\u7eff\u8272'}]\n }\n if (name == \"p4\"){\n opt.lables = [{lon:48.96,lat:-20.16,text:'\u6a31\u82b1'}]\n }\n opt.imgUrl = 'img/'+name+'.png';\n s.clean();\n s.config(opt);\n s.init();\n }\n```\n\n\n\n### 2.3.2 \u81ea\u5b9a\u4e49\u4e8b\u4ef6\n\n\u5f88\u591a\u60c5\u51b5\u4e0b\u9ed8\u8ba4\u53c2\u6570\u4e0d\u80fd\u6ee1\u8db3\u6211\u4eec\u7684\u4e1a\u52a1\u9700\u6c42\uff0c\u8fd9\u65f6\u53ef\u4ee5\u81ea\u5b9a\u4e49\u4e8b\u4ef6\u3002\n\n\u63d2\u4ef6\u63d0\u4f9b\u4e86listen\u51fd\u6570\u7528\u4e8e\u76d1\u542c\u5404\u79cd\u4e8b\u4ef6\u3002\n\n\u6dfb\u52a0\u6807\u8bb0\uff1a\n\n\n```js\n s.listen('dblclick',function (e) {\n var text = prompt(\"\u6807\u8bb0\u540d\u79f0\");\n if (text!=null && text!= undefined && text!=\"\") {\n s.addLable(e,text);\n alert(\"\u6dfb\u52a0\u6807\u8bb0\uff1a\"+text+\" \u540e\u53f0\u4ea4\u4e92\");\n }\n });\n```\n\n\u67e5\u8be2\u6807\u8bb0:\n\n```js\n s.listen('mousedown',function (e) {\n if (e.button == 2) {\n var p = s.getLable(e);\n if (p.lon!=null &&p.lon!=undefined&&p.lon!=\"\" ) {\n alert(\"\u7ecf\u5ea6\uff1a\" + p.lon + \",\u7eac\u5ea6\uff1a\" + p.lat + \",\u540d\u79f0\uff1a\" + p.text +\" \u5176\u4ed6\u64cd\u4f5c\");\n }\n }\n });\n```\n\n\u5220\u9664\u6807\u8bb0:\n\n```js\n s.listen('mousedown',function (e) {\n if (e.button == 1) {\n var p = s.getLable(e);\n if (p.lon!=null &&p.lon!=undefined&&p.lon!=\"\" ) {\n var c = confirm(\"\u60a8\u786e\u8ba4\u8981\u5220\u9664\u8be5\u6807\u8bb0\u5417\uff1f\");\n if (c) {\n s.delete(p);\n s.clean();\n s.init();\n alert(\"\u5220\u9664\u6210\u529f\uff01 
\u540e\u53f0\u4ea4\u4e92\")\n }\n }\n }\n });\n```\n\n\u89c9\u5f97\u672c\u63d2\u4ef6\u6709\u7528\u7684\u8bb0\u5f97\u70b9\u8d5e\u7ed9\u661f\uff01\u8c22\u8c22\u5927\u5bb6\u652f\u6301\uff01\n\n\n\n# \u5173\u4e8e\n\n\u60a8\u8fd8\u53ef\u4ee5\u5728\u4e0b\u9762\u7684\u5730\u65b9\u5173\u6ce8\u6211\uff0c\u5171\u540c\u5b66\u4e60\u8fdb\u6b65\u3002\n\n\n\n \n \n\n\n \n \n\n\n \n \n\n", "readme_type": "markdown", "hn_comments": "", "gh_updated_time": "", "gh_accessed_time": "", "hn_accessed_time": ""}, {"name": "dg92/Performance-Analysis-JS", "link": "https://github.com/dg92/Performance-Analysis-JS", "tags": ["javascript", "perfromance", "map", "reduce", "filter", "find", "javascript-functions", "es6", "lodash", "lodash-analysis", "functional-programming", "ramdajs", "ramda", "benchmarking"], "stars": 597, "description": "Map/Reduce/Filter/Find Vs For loop Vs For each Vs Lodash vs Ramda", "lang": "JavaScript", "repo_lang": "", "readme": "# Performance-Analysis\nComparing native JavaScript array methods map, reduce, filter, and find against for loop, forEach loop and lodash methods. The analysis uses basic operations and heavy data manipulation to analyze the execution speed of each method.\n\n### To run \n 1. Run `npm install`\n 2. Generate the [data](data.js) for the tests by running `npm run seed`. \n - The default array is 10000 elements in length. You can create an array of a custom length by passing the desired size as an arugment, like so `npm run seed 100000`.\n 2. For a small data set performance report run `npm run t:s`. \n - This runs the analysis on the first 5 elements of the array.\n 4. For a performance report on the whole array run `npm run t:l`\n\n To test your own function create them in the [formulas.js](formulas.js) file.\n \n### Results for small data set of array size 5 - 1000 \n![small_data_set_result](./small_data_set_result.png)\n\n### Results for mid data set of array size 3000 - 20000\n![mid_data_set_result](./mid_data_set_result.png)\n\n### Results for large data set of array size 50000 - 1000000\n![large_data_set_result](./large_data_set_result.png)\n\n### Coming Soon\n1. Ramda.js test\n2. Caching (inline, warm) considerations\n3. GC considerations\n\n### Note\n1. These results are computed using Node V8 v5.8.283.41\n2. These result does not consider the JIT, inline caching, hidden classes, deoptimizations, garbage collection, pretenuring etc.\n3. Result may vary as per env's.\n4. Red colour highlight in the above images is just for reference, will soon change.\n\n### Discussion/Posts\n1. [https://news.ycombinator.com/item?id=17050798](https://news.ycombinator.com/item?id=17050798)\n2. 
[https://medium.com/@ideepak.jsd/javascript-performance-test-for-vs-for-each-vs-map-reduce-filter-find-32c1113f19d7](https://medium.com/@ideepak.jsd/javascript-performance-test-for-vs-for-each-vs-map-reduce-filter-find-32c1113f19d7)\n\n\n\n", "readme_type": "markdown", "hn_comments": "", "gh_updated_time": "", "gh_accessed_time": "", "hn_accessed_time": ""}, {"name": "arvindr21/diskDB", "link": "https://github.com/arvindr21/diskDB", "tags": [], "stars": 596, "description": "A Lightweight Disk based JSON Database with a MongoDB like API for Node", "lang": "JavaScript", "repo_lang": "", "readme": "# diskDB [![Build Status](https://secure.travis-ci.org/arvindr21/diskDB.png?branch=master)](https://travis-ci.org/arvindr21/diskDB) [![NPM version](https://badge-me.herokuapp.com/api/npm/diskdb.png)](http://badges.enytc.com/for/npm/diskdb) [![Gitter](https://badges.gitter.im/Join%20Chat.svg)](https://gitter.im/arvindr21/diskDB?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge&utm_content=badge)\n\n[![NPM](https://nodei.co/npm/diskdb.png?downloads=true&stars=true)](https://nodei.co/npm/diskdb/)\n\nA Lightweight Disk based JSON Database with a MongoDB like API for Node.\n\n_You will never know that you are interacting with a File System_\n\n## Contents\n\n* [Getting Started](#getting-started)\n* [Documentation](#documentation)\n * [Connect](#connect-to-db)\n * [Load Collections](#load-collections)\n * [Write/Save](#writesave-to-collection)\n * [Read](#read-from-collection)\n * [Update](#update-collection)\n * [Remove](#remove-collection)\n * [Count](#count)\n* [Examples](#examples)\n* [Performance](#performance)\n* [Contributing](#contributing)\n* [Release History](#release-history)\n\n## Getting Started\nInstall the module locally : \n```bash\n$ npm install diskdb\n```\n\n```js\nvar db = require('diskdb');\ndb = db.connect('/path/to/db-folder', ['collection-name']);\n// you can access the traditional JSON DB methods here\n```\n\n## Documentation\n### Connect to DB\n```js\ndb.connect(pathToFolder, ['filename']);\n```\nFilename will be the name of the JSON file. 
You can omit the extension, diskDB will take care of it for you.\n\n```js\nvar db = require('diskdb');\ndb = db.connect('/examples/db', ['articles']);\n// or simply\ndb.connect('/examples/db', ['articles']);\n```\n\nThis will check for a directory at given path, if it does not exits, diskDB will throw an error and exit.\n\nIf the directory exists but the file/collection does not exist, diskDB will create it for you.\n\n**Note** : If you have manually created a JSON file, please make sure it contains a valid JSON array, otherwise diskDB\nwill return an empty array.\n\n```js\n[]\n```\nElse it will throw an error like\n\n```bash\nundefined:0\n\n^\nSyntaxError: Unexpected end of input\n```\n---\n### Load Collections\nAlternatively you can also load collections like\n\n```js\nvar db = require('diskdb');\n// this\ndb = db.connect('/examples/db');\ndb.loadCollections(['articles']);\n//or\ndb.connect('/examples/db');\ndb.loadCollections(['articles']);\n//or\ndb.connect('/examples/db')\n .loadCollections(['articles']);\n//or\ndb.connect('/examples/db', ['articles']);\n```\n#### Load Multiple Collections\n\n```js\nvar db = require('diskdb');\ndb.connect('/examples/db', ['articles','comments','users']);\n```\n---\n### Write/Save to Collection\n```js\ndb.collectionName.save(object);\n```\nOnce you have loaded a collection, you can access the collection's methods using the dot notation like\n\n```js\ndb.[collectionName].[methodname]\n```\nTo save the data, you can use\n```js\nvar db = require('diskdb');\ndb.connect('db', ['articles']);\nvar article = {\n title : \"diskDB rocks\",\n published : \"today\",\n rating : \"5 stars\"\n}\ndb.articles.save(article);\n// or\ndb.articles.save([article]);\n```\nThe saved data will be\n```js\n[\n {\n \"title\": \"diskDB rocks\",\n \"published\": \"today\",\n \"rating\": \"5 stars\",\n \"_id\": \"0f6047c6c69149f0be0c8f5943be91be\"\n }\n]\n```\nYou can also save multiple objects at once like\n\n```js\nvar db = require('diskdb');\ndb.connect('db', ['articles']);\nvar article1 = {\n title : 'diskDB rocks',\n published : 'today',\n rating : '5 stars'\n}\n\nvar article2 = {\n title : 'diskDB rocks',\n published : 'yesterday',\n rating : '5 stars'\n}\n\nvar article3 = {\n title : 'diskDB rocks',\n published : 'today',\n rating : '4 stars'\n}\ndb.articles.save([article1, article2, article3]);\n```\nAnd this will return the inserted objects\n\n```js\n[ { title: 'diskDB rocks',\n published: 'today',\n rating: '4 stars',\n _id: 'b1cdbb3525b84e8c822fc78896d0ca7b' },\n { title: 'diskDB rocks',\n published: 'yesterday',\n rating: '5 stars',\n _id: '42997c62e1714e9f9d88bf3b87901f3b' },\n { title: 'diskDB rocks',\n published: 'today',\n rating: '5 stars',\n _id: '4ca1c1597ddc4020bc41b4418e7a568e' } ]\n```\n---\n### Read from Collection\nThere are 2 methods available for reading the JSON collection\n* db.collectionName.find(query)\n* db.collectionName.findOne(query)\n\n\n#### db.collectionName.find()\n```js\nvar db = require('diskdb');\ndb.connect('/examples/db', ['articles']);\ndb.articles.find();\n```\nThis will return all the records\n```js\n[{\n title: 'diskDB rocks',\n published: 'today',\n rating: '5 stars',\n _id: '0f6047c6c69149f0be0c8f5943be91be'\n}]\n```\nYou can also query with a criteria like\n```js\nvar db = require('diskdb');\ndb.connect('/examples/db', ['articles']);\ndb.articles.find({rating : \"5 stars\"});\n```\nThis will return all the articles which have a rating of 5.\n\nFind can take multiple criteria\n```js\nvar db = 
require('diskdb');\ndb.connect('/examples/db', ['articles']);\ndb.articles.find({rating : \"5 stars\", published: \"yesterday\"});\n```\nThis will return all the articles with a rating of 5, published yesterday.\n\nNested JSON :\n\n```js\nvar articleComments = {\n title: 'diskDB rocks',\n published: '2 days ago',\n comments: [{\n name: 'a user',\n comment: 'this is cool',\n rating: 2\n }, {\n name: 'b user',\n comment: 'this is ratchet',\n rating: 3\n }, {\n name: 'c user',\n comment: 'this is awesome',\n rating: 2\n }]\n}\n```\n```js\nvar savedArticle = db.articles.save([articleComments);\nfoundArticles = db.articles.find({rating : 2});\n```\nSince diskDB is mostly for light weight data storage, avoid nested structures and huge datasets.\n\n#### db.collectionName.findOne(query)\n```js\nvar db = require('diskdb');\ndb.connect('/examples/db', ['articles']);\ndb.articles.findOne();\n```\n\nIf you do not pass a query, diskDB will return the first article in the collection. If you pass a query, it will return first article in the filtered data.\n\n```js\nvar db = require('diskdb');\ndb.connect('/examples/db', ['articles']);\ndb.articles.findOne({_id: '0f6047c6c69149f0be0c8f5943be91be'});\n```\n---\n### Update Collection\n```js\ndb.collectionName.update(query, data, options);\n```\n\nYou can also update one or many objects in the collection\n```js\noptions = {\n multi: false, // update multiple - default false\n upsert: false // if object is not found, add it (update-insert) - default false\n}\n```\nUsage\n```js\nvar db = require('diskdb');\ndb.connect('/examples/db', ['articles']);\n\nvar query = {\n title : 'diskDB rocks'\n};\n\nvar dataToBeUpdate = {\n title : 'diskDB rocks again!',\n};\n\nvar options = {\n multi: false,\n upsert: false\n};\n\nvar updated = db.articles.update(query, dataToBeUpdate, options);\nconsole.log(updated); // { updated: 1, inserted: 0 }\n```\n---\n### Remove Collection\n```js\ndb.collectionName.remove(query, multi);\n```\nYou can remove the entire collection (including the file) or you can remove the matched objects by passing in a query. When you pass a query, you can either delete all the matched objects or only the first one by passing `multi` as `false`. The default value of `multi` is `true`.\n\n```js\nvar db = require('diskdb');\ndb.connect('/examples/db', ['articles']);\ndb.articles.remove({rating : \"5 stars\"});\n```\n```js\nvar db = require('diskdb');\ndb.connect('/examples/db', ['articles']);\ndb.articles.remove({rating : \"5 stars\"}, true); // remove all matched. 
Default - multi = true\n```\n\n```js\nvar db = require('diskdb');\ndb.connect('/examples/db', ['articles']);\ndb.articles.remove({rating : \"5 stars\"}, false); // remove only the first match\n```\nUsing remove without any params will delete the file and will remove the db instance.\n```js\nvar db = require('diskdb');\ndb.connect('/examples/db', ['articles']);\ndb.articles.remove();\n```\nAfter the above operation `db.articles` is `undefined`.\n\n---\n### Count\n```js\ndb.collectionName.count();\n```\nWill return the count of objects in the Collection\n```js\nvar db = require('diskdb');\ndb.connect('/examples/db', ['articles']);\ndb.articles.count(); // will give the count\n```\n\n## Examples\nRefer to the [examples](https://github.com/arvindr21/diskDB/tree/master/examples) folder.\n\n## Performance\nTo validate diskDB's performance and to check if it meets your needs, you can clone this repo and run\n\n```bash\n$ node performance/time.js\n```\nAn average of few tests (run on OS X - 10.9.3 | 2.9GHZ i7 | 8GB 1600MHz DDR3) can be found below\n\n#### Time taken to process x number of objects (in ms) vs Action Performed\n\n\\# of objects | 1 | 1000 | 10000 | 100000 | 1000000\n-----------------------|------------|------------|------------|------------|-------------\nSave | 1 ms | 15 ms | 137 ms | 1728 ms | 14425 ms \nFind all without query | 0 ms | 2 ms | 12 ms | 204 ms | 2923 ms \nFind all with query | 0 ms | 2 ms | 17 ms | 738 ms | 1985 ms \nFind one without query | 0 ms | 1 ms | 9 ms | 791 ms | 1676 ms \nFind one with query | 0 ms | 1 ms | 8 ms | 219 ms | 1410 ms \nUpdate all records | 1 ms | 7 ms | 61 ms | 206 ms | 48035 ms \nGet count | 0 ms | 3 ms | 11 ms | 260 ms | 2420 ms \nRemove with query | 0 ms | 7 ms | 59 ms | 984 ms | 48191 ms \nRemove collection | 0 ms | 1 ms | 4 ms | 52 ms | 154 ms \nFile size | 0.000111 MB| 0.116671 MB| 1.196671 MB| 12.26667 MB| 125.66667 MB\n\n\n## Contributing\nSee the [CONTRIBUTING Guidelines](https://github.com/arvindr21/diskDB/blob/master/CONTRIBUTING.md)\n\n## Release History\n* 0.1.x\n * Base Module with\n * Connect to a Folder\n * Access a Collection/File\n * Create Read Update Delete on JSON object\n * Minor fixes and tests\n * Performance improvements\n\n## License\nCopyright (c) 2014 Arvind Ravulavaru. Licensed under the MIT license.\n", "readme_type": "markdown", "hn_comments": "", "gh_updated_time": "", "gh_accessed_time": "", "hn_accessed_time": ""}, {"name": "vue-gl/vue-gl", "link": "https://github.com/vue-gl/vue-gl", "tags": ["vuejs", "vuejs2", "vue2", "vue-components", "threejs", "three-js", "three", "vue", "3d", "3d-graphics", "webgl", "graphics", "tags", "tag", "element", "elements", "custom", "html", "web"], "stars": 597, "description": "Vue.js components rendering 3D WebGL graphics reactively with three.js", "lang": "JavaScript", "repo_lang": "", "readme": "# VueGL\n\n[Vue.js](https://vuejs.org/) components rendering 3D WebGL graphics reactively with\n[three.js](https://threejs.org/).\n\n[![NPM](https://nodei.co/npm/vue-gl.png?compact=true)](https://nodei.co/npm/vue-gl/\n) \n[![Financial Contributors on Open Collective](https://opencollective.com/vue-gl/all/badge.svg?label=financial+contributors)](https://opencollective.com/vue-gl)\n\n## Usage\n\n```html\n\n\n\n\n\n\n\n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n\n\n\n```\n\n[See the documentation](//vue-gl.github.io) for more information.\n\n## Available components\n\n[Components reference](//vue-gl.github.io/components/) shows a list of available\ncore components. 
[Example components reference](//vue-gl.github.io/examples/) also\nintroduces additional components you can use immediately.\n\nThe list of components not implemented yet can be found at [this project](https://github.com/vue-gl/vue-gl/projects/1).\n\n## Contribution\n\nAre you interested in enhance this product?\nWe're really glad and thanks a lot! \nSee [Contributing guidelines](CONTRIBUTING.md) to get started.\n\n### Code Contributors\n\nThis project exists thanks to all the people who contribute. [[Contribute](CONTRIBUTING.md)].\n\n \n \n\n### Financial Contributors\n\nBecome a financial contributor and help us sustain our community. [[Contribute](https://opencollective.com/vue-gl/contribute)]\n\n#### Individuals\n\n \n\n#### Organizations\n\nSupport this project with your organization. Your logo will show up here with a\nlink to your website. [[Contribute](https://opencollective.com/vue-gl/contribute)]\n\n \n \n \n \n \n \n \n \n \n \n\n## License\n\n[![FOSSA Status](https://app.fossa.io/api/projects/git%2Bgithub.com%2Fvue-gl%2Fvue-gl.svg?type=large)](https://app.fossa.io/projects/git%2Bgithub.com%2Fvue-gl%2Fvue-gl?ref=badge_large)\n", "readme_type": "markdown", "hn_comments": "Handy article but I still wish there was an easier way to use filters.", "gh_updated_time": "", "gh_accessed_time": "", "hn_accessed_time": ""}, {"name": "googleapis/nodejs-firestore", "link": "https://github.com/googleapis/nodejs-firestore", "tags": ["nodejs", "firestore", "database", "nosql"], "stars": 597, "description": "Node.js client for Google Cloud Firestore: a NoSQL document database built for automatic scaling, high performance, and ease of application development.", "lang": "JavaScript", "repo_lang": "", "readme": "[//]: # \"This README.md file is auto-generated, all changes to this file will be lost.\"\n[//]: # \"To regenerate it, use `python -m synthtool`.\"\n \n\n# [Cloud Firestore: Node.js Client](https://github.com/googleapis/nodejs-firestore)\n\n[![release level](https://img.shields.io/badge/release%20level-stable-brightgreen.svg?style=flat)](https://cloud.google.com/terms/launch-stages)\n[![npm version](https://img.shields.io/npm/v/@google-cloud/firestore.svg)](https://www.npmjs.org/package/@google-cloud/firestore)\n\n\n\n\nThis is the Node.js Server SDK for [Google Cloud Firestore](https://firebase.google.com/docs/firestore/). Google Cloud Firestore is a NoSQL document database built for automatic scaling, high performance, and ease of application development.\n\nThis Cloud Firestore Server SDK uses Google\u2019s Cloud Identity and Access Management for authentication and should only be used in trusted environments. Your Cloud Identity credentials allow you bypass all access restrictions and provide read and write access to all data in your Cloud Firestore project.\n\nThe Cloud Firestore Server SDKs are designed to manage the full set of data in your Cloud Firestore project and work best with reliable network connectivity. Data operations performed via these SDKs directly access the Cloud Firestore backend and all document reads and writes are optimized for high throughput.\n\nApplications that use Google's Server SDKs should not be used in end-user environments, such as on phones or on publicly hosted websites. 
If you are developing a Web or Node.js application that accesses Cloud Firestore on behalf of end users, use the firebase Client SDK.\n\n**Note:** This Cloud Firestore Server SDK does not support Firestore databases created in [Datastore mode](https://cloud.google.com/datastore/docs/firestore-or-datastore#in_datastore_mode). To access these databases, use the [Datastore SDK](https://www.npmjs.com/package/@google-cloud/datastore).\n\n\nA comprehensive list of changes in each version may be found in\n[the CHANGELOG](https://github.com/googleapis/nodejs-firestore/blob/main/CHANGELOG.md).\n\n* [Cloud Firestore Node.js Client API Reference][client-docs]\n* [Cloud Firestore Documentation][product-docs]\n* [github.com/googleapis/nodejs-firestore](https://github.com/googleapis/nodejs-firestore)\n\nRead more about the client libraries for Cloud APIs, including the older\nGoogle APIs Client Libraries, in [Client Libraries Explained][explained].\n\n[explained]: https://cloud.google.com/apis/docs/client-libraries-explained\n\n**Table of contents:**\n\n\n* [Quickstart](#quickstart)\n * [Before you begin](#before-you-begin)\n * [Installing the client library](#installing-the-client-library)\n * [Using the client library](#using-the-client-library)\n* [Samples](#samples)\n* [Versioning](#versioning)\n* [Contributing](#contributing)\n* [License](#license)\n\n## Quickstart\n\n### Before you begin\n\n1. [Select or create a Cloud Platform project][projects].\n1. [Enable the Cloud Firestore API][enable_api].\n1. [Set up authentication with a service account][auth] so you can access the\n API from your local workstation.\n\n### Installing the client library\n\n```bash\nnpm install @google-cloud/firestore\n```\n\n\n### Using the client library\n\n```javascript\nconst {Firestore} = require('@google-cloud/firestore');\n\n// Create a new client\nconst firestore = new Firestore();\n\nasync function quickstart() {\n // Obtain a document reference.\n const document = firestore.doc('posts/intro-to-firestore');\n\n // Enter new data into the document.\n await document.set({\n title: 'Welcome to Firestore',\n body: 'Hello World',\n });\n console.log('Entered new data into the document');\n\n // Update an existing document.\n await document.update({\n body: 'My first Firestore app',\n });\n console.log('Updated an existing document');\n\n // Read the document.\n const doc = await document.get();\n console.log('Read the document');\n\n // Delete the document.\n await document.delete();\n console.log('Deleted the document');\n}\nquickstart();\n\n```\n\n\n\n## Samples\n\nSamples are in the [`samples/`](https://github.com/googleapis/nodejs-firestore/tree/main/samples) directory. 
Each sample's `README.md` has instructions for running its sample.\n\n| Sample | Source Code | Try it |\n| --------------------------- | --------------------------------- | ------ |\n| Limit-to-last-query | [source code](https://github.com/googleapis/nodejs-firestore/blob/main/samples/limit-to-last-query.js) | [![Open in Cloud Shell][shell_img]](https://console.cloud.google.com/cloudshell/open?git_repo=https://github.com/googleapis/nodejs-firestore&page=editor&open_in_editor=samples/limit-to-last-query.js,samples/README.md) |\n| Quickstart | [source code](https://github.com/googleapis/nodejs-firestore/blob/main/samples/quickstart.js) | [![Open in Cloud Shell][shell_img]](https://console.cloud.google.com/cloudshell/open?git_repo=https://github.com/googleapis/nodejs-firestore&page=editor&open_in_editor=samples/quickstart.js,samples/README.md) |\n| Solution-counters | [source code](https://github.com/googleapis/nodejs-firestore/blob/main/samples/solution-counters.js) | [![Open in Cloud Shell][shell_img]](https://console.cloud.google.com/cloudshell/open?git_repo=https://github.com/googleapis/nodejs-firestore&page=editor&open_in_editor=samples/solution-counters.js,samples/README.md) |\n\n\n\nThe [Cloud Firestore Node.js Client API Reference][client-docs] documentation\nalso contains samples.\n\n## Supported Node.js Versions\n\nOur client libraries follow the [Node.js release schedule](https://nodejs.org/en/about/releases/).\nLibraries are compatible with all current _active_ and _maintenance_ versions of\nNode.js.\nIf you are using an end-of-life version of Node.js, we recommend that you update\nas soon as possible to an actively supported LTS version.\n\nGoogle's client libraries support legacy versions of Node.js runtimes on a\nbest-efforts basis with the following warnings:\n\n* Legacy versions are not tested in continuous integration.\n* Some security patches and features cannot be backported.\n* Dependencies cannot be kept up-to-date.\n\nClient libraries targeting some end-of-life versions of Node.js are available, and\ncan be installed through npm [dist-tags](https://docs.npmjs.com/cli/dist-tag).\nThe dist-tags follow the naming convention `legacy-(version)`.\nFor example, `npm install @google-cloud/firestore@legacy-8` installs client libraries\nfor versions compatible with Node.js 8.\n\n## Versioning\n\nThis library follows [Semantic Versioning](http://semver.org/).\n\n\n\nThis library is considered to be **stable**. The code surface will not change in backwards-incompatible ways\nunless absolutely necessary (e.g. because of critical security issues) or with\nan extensive deprecation period. Issues and requests against **stable** libraries\nare addressed with the highest priority.\n\n\n\n\n\n\nMore Information: [Google Cloud Platform Launch Stages][launch_stages]\n\n[launch_stages]: https://cloud.google.com/terms/launch-stages\n\n## Contributing\n\nContributions welcome! See the [Contributing Guide](https://github.com/googleapis/nodejs-firestore/blob/main/CONTRIBUTING.md).\n\nPlease note that this `README.md`, the `samples/README.md`,\nand a variety of configuration files in this repository (including `.nycrc` and `tsconfig.json`)\nare generated from a central template. 
To edit one of these files, make an edit\nto its templates in\n[directory](https://github.com/googleapis/synthtool).\n\n## License\n\nApache Version 2.0\n\nSee [LICENSE](https://github.com/googleapis/nodejs-firestore/blob/main/LICENSE)\n\n[client-docs]: https://cloud.google.com/nodejs/docs/reference/firestore/latest\n[product-docs]: https://cloud.google.com/firestore\n[shell_img]: https://gstatic.com/cloudssh/images/open-btn.png\n[projects]: https://console.cloud.google.com/project\n[billing]: https://support.google.com/cloud/answer/6293499#enable-billing\n[enable_api]: https://console.cloud.google.com/flows/enableapi?apiid=firestore.googleapis.com\n[auth]: https://cloud.google.com/docs/authentication/getting-started\n", "readme_type": "markdown", "hn_comments": "", "gh_updated_time": "", "gh_accessed_time": "", "hn_accessed_time": ""}, {"name": "eggjs/egg-sequelize", "link": "https://github.com/eggjs/egg-sequelize", "tags": ["egg-plugin", "egg", "sequelize", "orm"], "stars": 597, "description": "Sequelize for Egg.js", "lang": "JavaScript", "repo_lang": "", "readme": "# egg-sequelize\n\n[Sequelize](http://sequelizejs.com) plugin for Egg.js.\n\n> NOTE: This plugin just for integrate Sequelize into Egg.js, more documentation please visit http://sequelizejs.com.\n\n[![NPM version][npm-image]][npm-url]\n[![build status][travis-image]][travis-url]\n[![Test coverage][codecov-image]][codecov-url]\n[![David deps][david-image]][david-url]\n[![Known Vulnerabilities][snyk-image]][snyk-url]\n[![npm download][download-image]][download-url]\n\n[npm-image]: https://img.shields.io/npm/v/egg-sequelize.svg?style=flat-square\n[npm-url]: https://npmjs.org/package/egg-sequelize\n[travis-image]: https://img.shields.io/travis/eggjs/egg-sequelize.svg?style=flat-square\n[travis-url]: https://travis-ci.org/eggjs/egg-sequelize\n[codecov-image]: https://codecov.io/gh/eggjs/egg-sequelize/branch/master/graph/badge.svg\n[codecov-url]: https://codecov.io/gh/eggjs/egg-sequelize\n[david-image]: https://img.shields.io/david/eggjs/egg-sequelize.svg?style=flat-square\n[david-url]: https://david-dm.org/eggjs/egg-sequelize\n[snyk-image]: https://snyk.io/test/npm/egg-sequelize/badge.svg?style=flat-square\n[snyk-url]: https://snyk.io/test/npm/egg-sequelize\n[download-image]: https://img.shields.io/npm/dm/egg-sequelize.svg?style=flat-square\n[download-url]: https://npmjs.org/package/egg-sequelize\n\n## Install\n\n```bash\n$ npm i --save egg-sequelize\n$ npm install --save mysql2 # For both mysql and mariadb dialects\n\n# Or use other database backend.\n$ npm install --save pg pg-hstore # PostgreSQL\n$ npm install --save tedious # MSSQL\n```\n\n## Usage & configuration\n\n> Read the [tutorials](https://eggjs.org/en/tutorials/sequelize.html) to see a full example.\n\n- Enable plugin in `config/plugin.js`\n\n``` js\nexports.sequelize = {\n enable: true,\n package: 'egg-sequelize'\n}\n```\n\n- Edit your own configurations in `conif/config.{env}.js`\n\n```js\nexports.sequelize = {\n dialect: 'mysql', // support: mysql, mariadb, postgres, mssql\n database: 'test',\n host: 'localhost',\n port: 3306,\n username: 'root',\n password: '',\n // delegate: 'myModel', // load all models to `app[delegate]` and `ctx[delegate]`, default to `model`\n // baseDir: 'my_model', // load all files in `app/${baseDir}` as models, default to `model`\n // exclude: 'index.js', // ignore `app/${baseDir}/index.js` when load models, support glob and array\n // more sequelize options\n};\n```\n\nYou can also use the `connection uri` to configure the 
connection:\n\n```js\nexports.sequelize = {\n dialect: 'mysql', // support: mysql, mariadb, postgres, mssql\n connectionUri: 'mysql://root:@127.0.0.1:3306/test',\n // delegate: 'myModel', // load all models to `app[delegate]` and `ctx[delegate]`, defaults to `model`\n // baseDir: 'my_model', // load all files in `app/${baseDir}` as models, defaults to `model`\n // exclude: 'index.js', // ignore `app/${baseDir}/index.js` when loading models, supports glob and array\n // more sequelize options\n};\n```\n\negg-sequelize has the following default sequelize options:\n\n```js\n{\n delegate: 'model',\n baseDir: 'model',\n logging(...args) {\n // if benchmark is enabled, log the elapsed time\n const used = typeof args[1] === 'number' ? `[${args[1]}ms]` : '';\n app.logger.info('[egg-sequelize]%s %s', used, args[0]);\n },\n host: 'localhost',\n port: 3306,\n username: 'root',\n benchmark: true,\n define: {\n freezeTableName: false,\n underscored: true,\n },\n };\n```\n\nFor more documentation, please refer to [Sequelize.js](http://docs.sequelizejs.com/manual/installation/usage.html)\n\n## Model files\n\nPlease put models under the `app/model` dir by default.\n\n## Conventions\n\n| model file | class name |\n| ---------------- | ----------------------- |\n| `user.js` | `app.model.User` |\n| `person.js` | `app.model.Person` |\n| `user_group.js` | `app.model.UserGroup` |\n| `user/profile.js`| `app.model.User.Profile`|\n\n- Tables always have timestamp fields: `created_at datetime`, `updated_at datetime`.\n- Use underscore-style column names, for example: `user_id`, `comments_count`.\n\n## Examples\n\n### Standard\n\nDefine a model first.\n\n> NOTE: `options.delegate` defaults to `model`, so `app.model` is an [Instance of Sequelize](http://docs.sequelizejs.com/class/lib/sequelize.js~Sequelize.html#instance-constructor-constructor), so you can use methods like: `app.model.sync, app.model.query ...`\n\n```js\n// app/model/user.js\n\nmodule.exports = app => {\n const { STRING, INTEGER, DATE } = app.Sequelize;\n\n const User = app.model.define('user', {\n login: STRING,\n name: STRING(30),\n password: STRING(32),\n age: INTEGER,\n last_sign_in_at: DATE,\n created_at: DATE,\n updated_at: DATE,\n });\n\n User.findByLogin = async function(login) {\n return await this.findOne({\n where: {\n login: login\n }\n });\n }\n\n // don't use an arrow function here, `this` must be the model instance\n User.prototype.logSignin = async function() {\n return await this.update({ last_sign_in_at: new Date() });\n }\n\n return User;\n};\n\n```\n\nNow you can use it in your controller:\n\n```js\n// app/controller/user.js\nclass UserController extends Controller {\n async index() {\n const users = await this.ctx.model.User.findAll();\n this.ctx.body = users;\n }\n\n async show() {\n const user = await this.ctx.model.User.findByLogin(this.ctx.params.login);\n await user.logSignin();\n this.ctx.body = user;\n }\n}\n```\n\n### Associate\n\nDefine all your associations in `Model.associate()` and egg-sequelize will execute it after all models are loaded. See example below.\n\n### Multiple Datasources\n\negg-sequelize supports loading multiple datasources independently. You can use `config.sequelize.datasources` to configure and load them.\n\n```js\n// config/config.default.js\nexports.sequelize = {\n datasources: [\n {\n delegate: 'model', // load all models to app.model and ctx.model\n baseDir: 'model', // load models from `app/model/*.js`\n database: 'biz',\n // other sequelize configurations\n },\n {\n delegate: 'adminModel', // load all models to app.adminModel and ctx.adminModel\n baseDir: 'admin_model', // load models from `app/admin_model/*.js`\n database: 'admin',\n // other sequelize configurations\n },\n ],\n};\n```\n\nThen we can define models like this:\n\n```js\n// app/model/user.js\nmodule.exports = app => {\n const { STRING, INTEGER, DATE } = app.Sequelize;\n\n const User = app.model.define('user', {\n login: STRING,\n name: STRING(30),\n password: STRING(32),\n age: INTEGER,\n last_sign_in_at: DATE,\n created_at: DATE,\n updated_at: DATE,\n });\n\n return User;\n};\n\n// app/admin_model/user.js\nmodule.exports = app => {\n const { STRING, INTEGER, DATE } = app.Sequelize;\n\n const User = app.adminModel.define('user', {\n login: STRING,\n name: STRING(30),\n password: STRING(32),\n age: INTEGER,\n last_sign_in_at: DATE,\n created_at: DATE,\n updated_at: DATE,\n });\n\n return User;\n};\n```\n\nIf you define the same model for different datasources, the same model file will be executed twice against different databases, so we can use the second argument to get the sequelize instance:\n\n```js\n// app/model/user.js\n// if this file is loaded multiple times for different datasources\n// we can use the second argument to get the sequelize instance\nmodule.exports = (app, model) => {\n const { STRING, INTEGER, DATE } = app.Sequelize;\n\n const User = model.define('user', {\n login: STRING,\n name: STRING(30),\n password: STRING(32),\n age: INTEGER,\n last_sign_in_at: DATE,\n created_at: DATE,\n updated_at: DATE,\n });\n\n return User;\n};\n```\n\n### Customize Sequelize\n\nBy default, egg-sequelize will use sequelize@5. You can customize the sequelize version by passing a sequelize instance with `config.sequelize.Sequelize` like this:\n\n```js\n// config/config.default.js\nexports.sequelize = {\n Sequelize: require('sequelize'),\n};\n```\n\n### Full example\n\n```js\n// app/model/post.js\n\nmodule.exports = app => {\n const { STRING, INTEGER, DATE } = app.Sequelize;\n\n const Post = app.model.define('Post', {\n name: STRING(30),\n user_id: INTEGER,\n created_at: DATE,\n updated_at: DATE,\n });\n\n Post.associate = function() {\n app.model.Post.belongsTo(app.model.User, { as: 'user' });\n }\n\n return Post;\n};\n```\n\n\n```js\n// app/controller/post.js\nclass PostController extends Controller {\n async index() {\n const posts = await this.ctx.model.Post.findAll({\n attributes: [ 'id', 'user_id' ],\n include: { model: this.ctx.model.User, as: 'user' },\n where: { status: 'publish' },\n order: 'id desc',\n });\n\n this.ctx.body = posts;\n }\n\n async show() {\n const post = await this.ctx.model.Post.findByPk(this.params.id);\n const user = await post.getUser();\n post.setDataValue('user', user);\n this.ctx.body = post;\n }\n\n async destroy() {\n const post = await this.ctx.model.Post.findByPk(this.params.id);\n await post.destroy();\n this.ctx.body = { success: true };\n }\n}\n```\n\n## Sync model to db\n\n**We strongly recommend you use [Sequelize - Migrations](http://docs.sequelizejs.com/manual/tutorial/migrations.html) to create or migrate databases.**\n\n**This code should only be used in development.**\n\n```js\n// 
{app_root}/app.js\nmodule.exports = app => {\n if (app.config.env === 'local' || app.config.env === 'unittest') {\n app.beforeStart(async () => {\n await app.model.sync({force: true});\n });\n }\n};\n```\n\n## Migration\n\nUsing [sequelize-cli](https://github.com/sequelize/cli) to help manage your database, data structures and seed data. Please read [Sequelize - Migrations](http://docs.sequelizejs.com/manual/tutorial/migrations.html) to learn more infomations.\n\n## Recommended example\n\n- https://github.com/eggjs/examples/tree/master/sequelize/\n\n## Questions & Suggestions\n\nPlease open an issue [here](https://github.com/eggjs/egg/issues).\n\n## License\n\n[MIT](LICENSE)\n\n", "readme_type": "markdown", "hn_comments": "", "gh_updated_time": "", "gh_accessed_time": "", "hn_accessed_time": ""}, {"name": "yyx990803/register-service-worker", "link": "https://github.com/yyx990803/register-service-worker", "tags": [], "stars": 597, "description": "A script to simplify service worker registration with hooks for common events.", "lang": "JavaScript", "repo_lang": "", "readme": "# register-service-worker\n\nA script to simplify service worker registration with hooks for common events.\n\n## Usage\n\n**Note:** this script uses ES modules export and is expected to be used with a client side bundler that can handle ES modules syntax.\n\n``` js\nimport { register } from 'register-service-worker'\n\nregister('/service-worker.js', {\n registrationOptions: { scope: './' },\n ready (registration) {\n console.log('Service worker is active.')\n },\n registered (registration) {\n console.log('Service worker has been registered.')\n },\n cached (registration) {\n console.log('Content has been cached for offline use.')\n },\n updatefound (registration) {\n console.log('New content is downloading.')\n },\n updated (registration) {\n console.log('New content is available; please refresh.')\n },\n offline () {\n console.log('No internet connection found. App is running in offline mode.')\n },\n error (error) {\n console.error('Error during service worker registration:', error)\n }\n})\n```\n\nThe `ready`, `registered`, `cached`, `updatefound` and `updated` events passes a [ServiceWorkerRegistration](https://developer.mozilla.org/en-US/docs/Web/API/ServiceWorkerRegistration) instance in their arguments.\n\nThe `registrationOptions` object will be passed as the second argument to [`ServiceWorkerContainer.register()`](https://developer.mozilla.org/en-US/docs/Web/API/ServiceWorkerContainer/register#Parameters)\n", "readme_type": "markdown", "hn_comments": "", "gh_updated_time": "", "gh_accessed_time": "", "hn_accessed_time": ""}, {"name": "douglascrockford/JSCheck", "link": "https://github.com/douglascrockford/JSCheck", "tags": [], "stars": 597, "description": "A random property testing tool for JavaScript", "lang": "JavaScript", "repo_lang": "", "readme": "jscheck.js\nDouglas Crockford\n2012-04-24\n\nPublic Domain\n\nJSCheck is a testing tool for JavaScript. It was inspired by QuickCheck, a\ntesting tool for Haskell developed by Koen Claessen and John Hughes of\nChalmers University of Technology.\n\nJSCheck is a specification-driven testing tool. From a description of the\nproperties of a system, function, or object, it will generate random test\ncases attempting to prove those properties, and then report its findings.\nThat can be especially effective in managing the evolution of a program\nbecause it can show the conformance of new code to old code. 
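\n\nAs a rough illustration only (the exact specifier and callback names are\ndescribed in the documentation, and this sketch assumes the JSC global\nthat jscheck.js creates), a claim and its check might look like this:\n\n    // Illustrative sketch; consult the documentation for the exact API.\n    JSC.on_report(function (report) {\n        console.log(report);             // print the findings\n    });\n\n    JSC.claim(\"addition is commutative\", function (verdict, a, b) {\n        return verdict(a + b === b + a); // the property under test\n    }, [JSC.integer(100), JSC.integer(100)]);\n\n    JSC.check();                         // generate random cases and report\n\n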
It also provides\nan interesting level of self-documentation, because the executable\nspecifications it relies on can provide a good view of the workings of a\nprogram.\n\nAll of JSCheck can be loaded from a small file called jscheck.js.\n\nThe source is available at https://github.com/douglascrockford/JSCheck.\nThe documentation is available at https://www.crockford.com/jscheck.html.\n", "readme_type": "text", "hn_comments": "", "gh_updated_time": "", "gh_accessed_time": "", "hn_accessed_time": ""}, {"name": "WebReflection/circular-json", "link": "https://github.com/WebReflection/circular-json", "tags": [], "stars": 597, "description": "JSON does not handle circular references. Now it does", "lang": "JavaScript", "repo_lang": "", "readme": "CircularJSON\n============\n\n[![donate](https://img.shields.io/badge/$-donate-ff69b4.svg?maxAge=2592000&style=flat)](https://github.com/WebReflection/donate) ![Downloads](https://img.shields.io/npm/dm/circular-json.svg) [![Build Status](https://travis-ci.org/WebReflection/circular-json.svg?branch=master)](https://travis-ci.org/WebReflection/circular-json) [![Coverage Status](https://coveralls.io/repos/github/WebReflection/circular-json/badge.svg?branch=master)](https://coveralls.io/github/WebReflection/circular-json?branch=master)\n\nSerializes and deserializes otherwise valid JSON objects containing circular references into and from a specialized JSON format.\n\n- - -\n\n## The future of this module is called [flatted](https://github.com/WebReflection/flatted#flatted)\n\nSmaller, faster, and able to produce on average a reduced output too, [flatted](https://github.com/WebReflection/flatted#flatted) is the new, bloatless, ESM and CJS compatible, circular JSON parser.\n\nIt has now reached V1 and it implements the exact same JSON API.\n\nPlease note **CircularJSON is in maintenance only** and **[flatted](https://github.com/WebReflection/flatted#flatted) is its successor**.\n\n- - -\n\n### A Working Solution To A Common Problem\nA usage example:\n\n```JavaScript\nvar object = {};\nobject.arr = [\n object, object\n];\nobject.arr.push(object.arr);\nobject.obj = object;\n\nvar serialized = CircularJSON.stringify(object);\n// '{\"arr\":[\"~\",\"~\",\"~arr\"],\"obj\":\"~\"}'\n// NOTE: CircularJSON DOES NOT parse JS\n// it handles receiver and reviver callbacks\n\nvar unserialized = CircularJSON.parse(serialized);\n// { arr: [ [Circular], [Circular] ],\n// obj: [Circular] }\n\nunserialized.obj === unserialized;\nunserialized.arr[0] === unserialized;\nunserialized.arr.pop() === unserialized.arr;\n```\n\nA quick summary:\n\n * **new** in version `0.5`, you can specify a JSON parser different from JSON itself. `CircularJSON.parser = ABetterJSON;` is all you need.\n * uses `~` as a special prefix symbol to denote which parent the reference belongs to (i.e. 
`~root~child1~child2`)\n * reasonably fast in both serialization and deserialization\n * compact serialization for easier and slimmer transportation across environments\n * [tested and covered](test/circular-json.js) over nasty structures too\n * compatible with all JavaScript engines\n\nNode Installation & Usage\n============\n\n```bash\nnpm install --save circular-json\n```\n\n```javascript\n'use strict';\n\nvar\n CircularJSON = require('circular-json'),\n obj = { foo: 'bar' },\n str\n;\n \nobj.self = obj;\nstr = CircularJSON.stringify(obj);\n```\n\nThere are no dependencies.\n\nBrowser Installation & Usage\n================\n\n* Global: \n* AMD: \n* CommonJS: \n\n(generated via [gitstrap](https://github.com/WebReflection/gitstrap))\n\n```html\n\n```\n\n```javascript\n'use strict';\n\nvar CircularJSON = window.CircularJSON\n , obj = { foo: 'bar' }\n , str\n ;\n \nobj.self = obj;\nstr = CircularJSON.stringify(obj);\n```\n\nNOTE: Platforms without native JSON (i.e. MSIE <= 8) requires `json3.js` or similar.\n\nIt is also *a bad idea* to `CircularJSON.parse(JSON.stringify(object))` because of those manipulation used in `CircularJSON.stringify()` able to make parsing safe and secure.\n\nAs summary: `CircularJSON.parse(CircularJSON.stringify(object))` is the way to go, same is for `JSON.parse(JSON.stringify(object))`.\n\nAPI\n===\n\nIt's the same as native JSON, except the fourth parameter `placeholder`, which circular references to be replaced with `\"[Circular]\"` (i.e. for logging).\n\n* CircularJSON.stringify(object, replacer, spacer, placeholder)\n* CircularJSON.parse(string, reviver)\n\nBear in mind `JSON.parse(CircularJSON.stringify(object))` will work but not produce the expected output.\n\nSimilar Libraries\n=======\n\n### Why Not the [@izs](https://twitter.com/izs) One\nThe module [json-stringify-safe](https://github.com/isaacs/json-stringify-safe) seems to be for `console.log()` but it's completely pointless for `JSON.parse()`, being latter one unable to retrieve back the initial structure. Here an example:\n\n```JavaScript\n// a logged object with circular references\n{\n \"circularRef\": \"[Circular]\",\n \"list\": [\n \"[Circular]\",\n \"[Circular]\"\n ]\n}\n// what do we do with above output ?\n```\n\nJust type this in your `node` console: `var o = {}; o.a = o; console.log(o);`. The output will be `{ a: [Circular] }` ... 
good, but that ain't really solving the problem.\n\nHowever, if that's all you need, the function used to create that kind of output is probably faster than `CircularJSON` and surely fits in less lines of code.\n\n\n### Why Not {{put random name}} Solution\nSo here the thing: circular references can be wrong but, if there is a need for them, any attempt to ignore them or remove them can be considered just a failure.\n\nNot because the method is bad or it's not working, simply because the circular info, the one we needed and used in the first place, is lost!\n\nIn this case, `CircularJSON` does even more than just solve circular and recursions: it maps all same objects so that less memory is used as well on deserialization as less bandwidth too!\nIt's able to redefine those references back later on so the way we store is the way we retrieve and in a reasonably performant way, also trusting the snappy and native `JSON` methods to iterate.\n", "readme_type": "markdown", "hn_comments": "", "gh_updated_time": "", "gh_accessed_time": "", "hn_accessed_time": ""}, {"name": "dirkgroenen/SVGMagic", "link": "https://github.com/dirkgroenen/SVGMagic", "tags": [], "stars": 597, "description": "Fallback for SVG images by automatically creating PNG versions on-the-fly", "lang": "JavaScript", "repo_lang": "", "readme": "[![ScreenShot](http://svgmagic.bitlabs.nl/svgmagic_tweakers.png)](http://svgmagic.bitlabs.nl)\n\nSVGMagic - Cross browser SVG\n========\n\n> **This repository is no longer actively mainted. It has proven to be very usefull back in 2013, but these days SVGs are supported by [pretty much all major browsers](https://caniuse.com/#feat=mdn-svg_elements_g).\n> The plugin will continue to work, but no guarantees will be given over the API's availability. It has been a great ride!**\n\nThis ease-to-use jQuery plugin will create a fallback for .SVG images on your website. When the plugin notices that the visitors browser doesn't support .SVG images it will replace those images with new .PNG images. Those .PNG images are created on the run using a serverside script. When the visitors browser does support .SVG images it will just go back to sleep.\n\nA big advantage of SVGMagic is that you don't have to create multiple versions of your images. You can just focus on the .SVG images and let SVGMagic do the rest.\n\nYou can find more information and demos on [our website](http://svgmagic.bitlabs.nl/).\n\n[![Build Status](https://travis-ci.org/dirkgroenen/SVGMagic.svg?branch=master)](https://travis-ci.org/dirkgroenen/SVGMagic)\n\nSVG... what/why?\n------------\nSVG is a vector graphics format, meaning it's perfectly scalable. Whatever size it needs to display at, or whatever screen it needs to display on, an SVG will adapt perfectly. This means that you can use the same image for desktop and mobile (including Retina) visitors. They all get a perfectly sharp image.\n\nInstallation\n------------\nJust include the script in your header and call the plugin in your ```$(document).ready()```\n```code\n\n\n```\nSVGMagic also supports backgroundimages. You need to parse the div containing the backgroundimage including the ```backgroundimage``` option.\n```code\n\n\n```\n\nOptions\n-------\nYou can parse an options object into SVGMagic. 
Currently it supports the following options:\n```code\n$('img').svgmagic({\n temporaryHoldingImage: null, // Image that will appear when an image gets converted\n forceReplacements: false, // Force replacement in all browsers\n handleBackgroundImages: false, // Search the dom for CSS background images\n additionalRequestData: {}, // Add extra data to the ajax request.\n postReplacementCallback:null, // Function to run before replacement\n\n // New options\n remoteServerUri: 'https://bitlabs.nl/svgmagic/converter/3/', // Uri of the (remote) API script\n remoteRequestType: 'POST', // Request type for the API call\n remoteDataType: 'jsonp', // Data type for the API call\n debug: 'false' // Show usefull debug information in the console\n});\n```\n\n### additionalRequestData\nThe ```additionalRequestData``` option gives you the posibility to add extra data to the ajax request. The default API script supports two extra options: ```{secure: true}``` and ```{dumpcache: true}```.\n\nLocal development\n-----------------\nSVGMagic needs public access to the images on your website, which means that you can't use it when developing in a local environment. In case you still need to use the plugin you can download the ```converter.php``` script and place it on your local machine.\n\nSupport\n-------\nThe plugin is tested in Internet Explorer Version 7 and 8 (other browsers already support SVG files).\n\nSecurity / How it works\n--------\nThe script makes use of a server side php script that converts the SVG to an PNG. The plugin will send a request to the server containing the images' sources. The server will then grab those images, convert them to PNG, temporarily save them and send the URL of the new images back to the plugin. When the plugin receives the new URL it will replace the .SVG images with the new ones.\n\nThis will only happen when the plugin notices that the user's browser doesn't support SVG images. At the moment IE8 and lower and Android 2.* don't support SVG images.\n\nDemo\n----\nA demo of SVGMagic can be found on the [SVGMagic website](http://svgmagic.bitlabs.nl/).\n\nKnown bugs\n----------\n- When many images need to be replaced the URL can get too long which will result in a server 414 error.\n\nChangelog\n----------\n## 3.0.0 (2014-11-22)\n#### Client:\n New features:\n - SVGMagic can now return usefull debug information while replacing SVG images.\n - Added timeout to ajax request. Show debug information when timeout gets exceeded.\n\n New options:\n - debug: Show usefull debug information in the console\n\n Documentation:\n - Added changelog to the bottom of the README\n - Automatically return images over https when request was over https.\n\n#### Server:\n New features:\n - Fully rewrite of the server script. 
The server will now provide much more information about the convert process.\n - Response will contain the creation date of cached images.\n - Data images are now also cached.\n\n## 2.4.0 (2014-08-01)\n\n New features:\n - Add extra post data to the ajax request\n - Now also finds data:image SVGs\n\n New options:\n - temporaryHoldingImage: replacement for preloader\n - forceReplacements: replacement for testmode\n - handleBackgroundImages: replacement for backgroundimage\n - additionalRequestData: send extra data to the server that replaces the SVGs for PNGs\n - postReplacementCallback: callback function that will be executed before replacement\n - remoteServerUri: the URI of the remote server that converts the images to PNG\n - remoteRequestType: set the type of the ajax request (post/get)\n - remoteDataType: the datatype sent to and received from the remote server\n\n Deprecated options:\n - preloader > temporaryHoldingImage\n - testmode > forceReplacements\n - backgroundimage > handleBackgroundImages\n - secure > additionalRequestData\n - callback > postReplacementCallback\n - dumpcache > additionalRequestData\n", "readme_type": "markdown", "hn_comments": "", "gh_updated_time": "", "gh_accessed_time": "", "hn_accessed_time": ""}, {"name": "daniel-lundin/react-dom-confetti", "link": "https://github.com/daniel-lundin/react-dom-confetti", "tags": [], "stars": 597, "description": "Trigger confetti explosions on state transitions", "lang": "JavaScript", "repo_lang": "", "readme": "# react-dom-confetti\n\n[![npm version](https://badge.fury.io/js/react-dom-confetti.svg)](https://www.npmjs.com/package/react-dom-confetti)\n\nTrigger confetti explosions on state transitions:\n\n```js\nimport Confetti from 'react-dom-confetti';\n\n// in render\n \n```\n\nThis component will trigger a confetti explosion every time the prop `active` goes from a falsy to truthy value.\n\n## Demo\n[https://daniel-lundin.github.io/react-dom-confetti/](https://daniel-lundin.github.io/react-dom-confetti/)\n\n## Why?\nSlow operations annoy users and stakeholders. We have two options, either optimize slow operations or **make it worth the wait**. This library focuses on the latter.\n\n### Props\n\n#### active\n\nRequired. Triggers an explosion when the prop transitions from falsy to truthy.\n\n#### config\n\nOptional. Configuration object to control the characteristics of the confetti:\n\n- `angle` - direction of the explosion in degrees, defaults to 90.\n- `spread` - spread of the explosion in degrees, defaults to 45.\n- `startVelocity` - Initial velocity of the particles, defaults to 45.\n- `width`: - width of the confetti elements\n- `height`: - height of the confetti elements\n- `elementCount` - Number of particle elements, defaults to 50.\n- `decay` - *deprecated* - Decrease in velocity per frame, defaults to 0.9 (Use of this will disable dragFriction)\n- `dragFriction` - Decrease in velocity proportional to current velocity, default to 0.1\n- `delay` - *deprecated* Use stagger instead.\n- `stagger` - Delay for each fetti in milliseconds, defaults to 0.\n- `random` - Randomization function, defaults to Math.random\n- `colors` - An array of color codes, defaults to `['#a864fd', '#29cdff', '#78ff44', '#ff718d' '#fdff6a']`\n\nLicense MIT, copyright [Daniel Lundin](https://www.twitter.com/daniel-lundin) 2017\n", "readme_type": "markdown", "hn_comments": "Fun! The confetti effect was one of the first things I learned to do in JS, though it was a few years ago before React. 
A lot of finding splines manually.Well done!That's great! Just what I need for user oriented milestones :)", "gh_updated_time": "", "gh_accessed_time": "", "hn_accessed_time": ""}, {"name": "asseinfo/react-kanban", "link": "https://github.com/asseinfo/react-kanban", "tags": ["react", "reactjs", "kanban-board", "js", "javascript", "kanban", "trello", "hacktoberfest"], "stars": 597, "description": "Yet another Kanban/Trello board lib for React.", "lang": "JavaScript", "repo_lang": "", "readme": "## \ud83d\udd34\ud83d\udd34\ud83d\udd34 THIS PROJECT IS DEPRECATED \ud83d\udd34\ud83d\udd34\ud83d\udd34\n\nUnfortunately this project is deprecated and it is not maintained anymore.\n\nIn the past, our core product was using `react-kanban` and indirectly we were improving this project. But nowadays we replaced it for another internal simplier version.\n\nAs a small dev team, it's impossible for us to continue improving both sources without an external help.\n\nThis source is going to continue available here, but, unfortunately, none effort is going to be done to improve the project.\n\n## react-kanban\n\n[![Test Coverage](https://codecov.io/gh/lourenci/react-kanban/branch/main/graph/badge.svg)](https://codecov.io/gh/lourenci/react-kanban)\n[![Build Status](https://github.com/lourenci/react-kanban/workflows/Test/badge.svg?branch=main)](https://github.com/lourenci/react-kanban/actions?query=branch%3Amain+workflow%3ATest)\n[![JavaScript Style Guide](https://img.shields.io/badge/code_style-standard-brightgreen.svg)](https://standardjs.com)\n\nYet another Kanban/Trello board lib for React.\n\n![Kanban Demo](https://i.imgur.com/yceKUEp.gif)\n\n### \u25b6\ufe0f Demo\n\n[Usage](https://nvjp3.csb.app/)\n\n[![Edit react-kanban-demo](https://codesandbox.io/static/img/play-codesandbox.svg)](https://codesandbox.io/s/react-kanban-demo-nvjp3)\n\n## \u2753 Why?\n\n- \ud83d\udc4a Reliable: 100% tested on CI; 100% coverage; 100% SemVer.\n- \ud83c\udfae Having fun: Play with Hooks \ud83c\udfa3 and ~~Styled Components~~.\n- \u267f\ufe0f Accessible: Keyboard and mobile friendly.\n- \ud83d\udd0c Pluggable: For use in projects.\n\n## \ud83d\udee0 Install and usage\n\nSince this project use Hooks, you have to install them:\n\n- `react>=16.8.5`\n\nAfter, Install the lib on your project:\n\n```bash\nyarn add @asseinfo/react-kanban\n```\n\nImport the lib and use it on your project:\n\n```js\nimport Board from '@asseinfo/react-kanban'\nimport '@asseinfo/react-kanban/dist/styles.css'\n\nconst board = {\n columns: [\n {\n id: 1,\n title: 'Backlog',\n cards: [\n {\n id: 1,\n title: 'Add card',\n description: 'Add capability to add a card in a column'\n },\n ]\n },\n {\n id: 2,\n title: 'Doing',\n cards: [\n {\n id: 2,\n title: 'Drag-n-drop support',\n description: 'Move a card between the columns'\n },\n ]\n }\n ]\n}\n\n \n```\n\n## \ud83d\udd25 API\n\n### \ud83d\udd79 Controlled and Uncontrolled\n\nWhen you need a better control over the board, you should stick with the controlled board.\nA controlled board means you need to deal with the board state yourself, you need to keep the state in your hands (component) and pass this state to the ` `, we just reflect this state.\nThis also means a little more of complexity, although we make available some helpers to deal with the board shape.\nYou can read more in the React docs, [here](https://reactjs.org/docs/forms.html#controlled-components) and [here](https://reactjs.org/docs/uncontrolled-components.html).\n\nIf you go with the controlled one, you need to pass your board through the 
`children` prop, otherwise you need to pass it through the `initialBoard` prop.\n\n#### Helpers to work with the controlled board\n\nWe expose some APIs that you can import to help you to work with the controlled state. Those are the same APIs we use internally to manage the uncontrolled board. We really recommend you to use them, they are 100% unit tested and they don't do any side effect to your board state.\n\nTo use them, you just need to import them together with your board:\n\n```js\nimport Board, { addCard, addColumn, ... } from '@asseinfo/react-kanban'\n```\n\n**All the helpers you need to pass your board and they will return a new board to pass to your state:**\n\n```js\nimport Board, { addColumn } from '@asseinfo/react-kanban'\n...\nconst [board, setBoard] = useState(initialBoard)\n...\nconst newBoard = addColumn(board, newColumn)\nsetBoard(newBoard)\n...\n{board} \n```\n\n[You can see the list of helpers in the end of the props documentation.](#-helpers-to-be-used-with-an-controlled-board)\n\n### \ud83d\udd37 Shape of a board\n\n```js\n{\n columns: [{\n id: ${unique-required-columnId},\n title: {$required-columnTitle**},\n cards: [{\n id: ${unique-required-cardId},\n title: ${required-cardTitle*}\n description: ${required-description*}\n }]\n }]\n}\n```\n\n\\* The `title` and the `description` are required if you are using the card's default template. You can render your own card template through the [`renderCard`](#rendercard) prop.\n\n\\*\\* The `title` is required if you are using the column's default template. You can render your own column template through the [`renderColumnHeader`](#rendercolumnheader) prop.\n\n### \u2699\ufe0f Props\n\n| Prop | Description | Controlled | Uncontrolled |\n| ----------------------------------------------------------------------------------------------------------------------------- | ----------------------------------------------------------------------------------------------------------------- | ---------- | ------------ |\n| [`children`](#children) (required if controlled) | The board to render | \u2705 | \ud83d\udeab |\n| [`initialBoard`](#initialboard) (required if uncontrolled) | The board to render | \ud83d\udeab | \u2705 |\n| [`onCardDragEnd`](#oncarddragend) | Callback that will be called when the card move ends | \u2705 | \u2705 |\n| [`onColumnDragEnd`](#oncolumndragend) | Callback that will be called when the column move ends | \u2705 | \u2705 |\n| [`renderCard`](#rendercard) | A card to be rendered instead of the default card | \u2705 | \u2705 |\n| [`renderColumnHeader`](#rendercolumnheader) | A column header to be rendered instead of the default column header | \u2705 | \u2705 |\n| [`allowAddColumn`](#allowaddcolumn) | Allow a new column be added by the user | \u2705 | \u2705 |\n| [`onNewColumnConfirm`](#onnewcolumnconfirm) (required if use the default column adder template) | Callback that will be called when a new column is confirmed by the user through the default column adder template | \u2705 | \u2705 |\n| [`onColumnNew`](#oncolumnnew) (required if `allowAddColumn` or when [`addColumn`](#rendercolumnadder) is called) | Callback that will be called when a new column is added through the default column adder template | \ud83d\udeab | \u2705 |\n| [`renderColumnAdder`](#rendercolumnadder) | A column adder to be rendered instead of the default column adder template | \u2705 | \u2705 |\n| [`disableColumnDrag`](#disablecolumndrag) | Disable the column move | \u2705 | \u2705 |\n| [`disableCardDrag`](#disablecarddrag) | Disable the 
card move | \u2705 | \u2705 |\n| [`allowRemoveColumn`](#allowremovecolumn) | Allow to remove a column in default column header | \u2705 | \u2705 |\n| [`onColumnRemove`](#oncolumnremove) (required if `allowRemoveColumn` or when [`removeColumn`](#rendercolumnheader) is called) | Callback that will be called when a column is removed | \u2705 | \u2705 |\n| [`allowRenameColumn`](#allowrenamecolumn) | Allow to rename a column in default column header | \u2705 | \u2705 |\n| [`onColumnRename`](#oncolumnrename) (required if `allowRenameColumn` or when [`renameColumn`](#rendercolumnheader) is called) | Callback that will be called when a column is renamed | \u2705 | \u2705 |\n| [`allowRemoveCard`](#allowremovecard) | Allow to remove a card in default card template | \u2705 | \u2705 |\n| [`onCardRemove`](#oncardremove) (required if `allowRemoveCard`) | Callback that will be called when a card is removed | \u2705 | \u2705 |\n| [`allowAddCard`](#allowaddcard) | Allow to add a card. Expect an object with the position to add the card in the column. | \ud83d\udeab | \u2705 |\n| [`onCardNew`](#oncardnew) (required if `allowAddCard` or when [`addCard`](#rendercolumnheader) is called) | Callback that will be called when a new card is added through the default card adder template | \ud83d\udeab | \u2705 |\n| [`onNewCardConfirm`](#onnewcardconfirm) (required if `allowAddCard`) | Callback that will be called when a new card is confirmed by the user through the default card adder template | \ud83d\udeab | \u2705 |\n\n#### `children`\n\nThe board. Use this prop if you want to control the board's state.\n\n#### `initialBoard`\n\nThe board. Use this prop if you don't want to control the board's state.\n\n#### `onCardDragEnd`\n\nWhen the user moves a card, this callback will be called passing these parameters:\n\n| Arg | Description |\n| ------------- | ---------------------------------------------------------------- |\n| `board` | The modified board |\n| `card` | The moved card |\n| `source` | An object with the card source `{ fromColumnId, fromPosition }` |\n| `destination` | An object with the card destination `{ toColumnId, toPosition }` |\n\n##### Source and destination\n\n| Prop | Description |\n| -------------- | ------------------------------------------- |\n| `fromColumnId` | Column source id. |\n| `toColumnId` | Column destination id. |\n| `fromPosition` | Card's index in column source's array. |\n| `toPosition` | Card's index in column destination's array. |\n\n#### `onColumnDragEnd`\n\nWhen the user moves a column, this callback will be called passing these parameters:\n\n| Arg | Description |\n| ------------- | ------------------------------------------------------ |\n| `board` | The modified board |\n| `column` | The moved column |\n| `source` | An object with the column source `{ fromPosition }` |\n| `destination` | An object with the column destination `{ toPosition }` |\n\n##### Source and destination\n\n| Prop | Description |\n| -------------- | ------------------------------- |\n| `fromPosition` | Column index before the moving. |\n| `toPosition` | Column index after the moving. |\n\n#### `renderCard`\n\nUse this if you want to render your own card. 
You have to pass a function and return your card component.\nThe function will receive these parameters:\n\n| Arg | Description |\n| --------- | ---------------------------------------------------------------- |\n| `card` | The card props |\n| `cardBag` | A bag with some helper functions and state to work with the card |\n\n##### `cardBag`\n\n| function | Description |\n| ------------- | ----------------------------------------------------- |\n| `removeCard*` | Call this function to remove the card from the column |\n| `dragging` | Whether the card is being dragged or not |\n\n\\* It's unavailable when the board is controlled.\n\nEx.:\n\n```js\nconst board = {\n columns: [{\n id: ${unique-required-columnId},\n title: ${columnTitle},\n cards: [{\n id: ${unique-required-cardId},\n dueDate: ${cardDueDate},\n content: ${cardContent}\n }]\n }]\n}\n\n (\n \n {content}\n Remove Card \n \n )}\n>\n{board}\n \n```\n\n#### `renderColumnHeader`\n\nUse this if you want to render your own column header. You have to pass a function and return your column header component.\nThe function will receive these parameters:\n\n| Arg | Description |\n| ----------- | -------------------------------------------------------- |\n| `column` | The column props |\n| `columnBag` | A bag with some helper functions to work with the column |\n\n##### `columnBag`\n\n| function | Description |\n| --------------- | ---------------------------------------------------------- |\n| `removeColumn*` | Call this function to remove the column from the board |\n| `renameColumn*` | Call this function with a title to rename the column |\n| `addCard*` | Call this function with a new card to add it in the column |\n\n**`addCard`**: As a second argument you can pass an option to define where in the column you want to add the card:\n\n- `{ on: 'top' }`: to add on the top of the column.\n- `{ on: 'bottom' }`: to add on the bottom of the column (default).\n\n\\* It's unavailable when the board is controlled.\n\nEx.:\n\n```js\nconst board = {\n columns: [{\n id: ${unique-required-columnId},\n title: ${columnTitle},\n wip: ${wip},\n cards: [{\n id: ${unique-required-cardId},\n title: ${required-cardTitle},\n description: ${required-cardDescription}\n }]\n }]\n}\n\n (\n \n {title}\n Remove Column \n renameColumn('New title')}>Rename Column \n addCard({ id: 99, title: 'New Card' })}>Add Card \n \n {board}\n \n```\n\n#### `allowAddColumn`\n\nAllow the user to add a new column directly by the board.\n\n#### `onNewColumnConfirm`\n\nWhen the user confirms a new column through the default column adder template, this callback will be called with a draft of a column with the title typed by the user.\n\nIf your board is uncontrolled you **must** return the new column with its new id in this callback.\n\nIf your board is controlled use this to get the new column title.\n\nEx.:\n\n```js\nfunction onColumnNew (newColumn) {\n const newColumn = { id: ${required-new-unique-columnId}, ...newColumn }\n return newColumn\n}\n\n \n```\n\n#### `onColumnNew`\n\nWhen the user adds a new column through the default column adder template, this callback will be called passing the updated board and the new column.\n\nThis callback will not be called in an uncontrolled board.\n\n#### `renderColumnAdder`\n\nUse this if you want to render your own column adder. 
You have to pass a function and return your column adder component.\nThe function will receive these parameters:\n\n| Arg | Description |\n| ----------- | -------------------------------- |\n| `columnBag` | A bag with some helper functions |\n\n##### `columnBag`\n\n| function | Description |\n| ------------ | ---------------------------------------------------------- |\n| `addColumn*` | Call this function with a new column to add the new column |\n\n\\* It's unavailable when the board is controlled.\n\nEx.:\n\n```js\nconst ColumnAdder = ({ addColumn }) {\n return (\n addColumn({id: ${required-new-unique-columnId}, title: 'Title', cards:[]})}>\n Add column\n
\n )\n}\n\n }\n {board}\n \n```\n\n#### `disableColumnDrag`\n\nDisallow the user from move a column.\n\n#### `disableCardDrag`\n\nDisallow the user from move a card.\n\n#### `allowRemoveColumn`\n\nWhen using the default header template, when you don't pass a template through the `renderColumnHeader`, it will allow the user to remove a column.\n\n#### `onColumnRemove`\n\nWhen the user removes a column, this callback will be called passing these parameters:\n\n| Arg | Description |\n| -------- | ------------------------------------ |\n| `board` | The board without the removed column |\n| `column` | The removed column |\n\n#### `allowRenameColumn`\n\nWhen using the default header template, when you don't pass a template through the `renderColumnHeader`, it will allow the user to rename a column.\n\n#### `onColumnRename`\n\nWhen the user renames a column, this callback will be called passing these parameters:\n\n| Arg | Description |\n| -------- | --------------------------------- |\n| `board` | The board with the renamed column |\n| `column` | The renamed column |\n\n#### `allowRemoveCard`\n\nWhen using the default card template, when you don't pass a template through the `renderCard`, it will allow the user to remove a card.\n\n#### `onCardRemove`\n\nWhen the user removes a card, this callback will be called passing these parameters:\n\n| Arg | Description |\n| -------- | ------------------------------------ |\n| `board` | The board without the removed column |\n| `column` | The column without the removed card |\n| `card` | The removed card |\n\n#### `allowAddCard`\n\nAllow the user to add a card in the column directly by the board. By default, it adds the card on the bottom of the column, but you can specify whether you want to add at the top or at the bottom of the board by passing an object with 'on' prop.\n\nE.g.:\n // at the bottom by default\n // in the bottom of the column\n // at the top of the column\n\n### \ud83d\udd29 Helpers to be used with an controlled board\n\n#### `moveColumn`\n\n| Arg | Description |\n| ------------------ | --------------------------------------- |\n| `board` | Your board |\n| `{ fromPosition }` | Index of column to be moved |\n| `{ toPosition }` | Index destination of column to be moved |\n\n#### `moveCard`\n\nUse this on a controlled board, the \"from\" and \"to\" are the same ones passed to onCardDragEnd callback. You can used this within your onCardDragEnd call back to actually update your board as it will return a new board which you can save down into state.\n\n| Arg | Description |\n| -------------------------------- | ------------------------------------------ |\n| `board` | Your board |\n| `{ fromPosition, fromColumnId }` |An object with the card source `{ fromColumnId, fromPosition }` which are the indexes of the cards current position |\n| `{ toPosition, toColumnId }` | An object with the card destination `{ fromColumnId, fromPosition }` which are the indexes of the cards new position |\n\n#### `addColumn`\n\n| Arg | Description |\n| -------- | ------------------ |\n| `board` | Your board |\n| `column` | Column to be added |\n\n#### `removeColumn`\n\n| Arg | Description |\n| -------- | -------------------- |\n| `board` | Your board |\n| `column` | Column to be removed |\n\n#### `changeColumn`\n\n| Arg | Description |\n| -------- | ------------------------------------------------------------------------------------------------- |\n| `board` | Your board |\n| `column` | Column to be renamed |\n| `object` | Pass a object to be merged with the column. 
You can add new props and/or change the existing ones |\n\n#### `addCard`\n\n| Arg | Description |\n| ---------------------- | ----------------------------------------------------------------------------------- |\n| `board` | Your board |\n| `inColumn` | Column to add the card be added |\n| `card` | Card to be added |\n| `{ on: 'bottom|top' }` | Whether the card will be added on top or bottom of the column (`bottom` is default) |\n\n#### `changeCard`\n\n| Arg | Description |\n| -------- | ------------------------------------------------------------------------------------------------- |\n| `board` | Your board |\n| `cardId` | Card's id to be changed\n| `object` | Pass a object to be merged with the card. You can add new props and/or change the existing ones |\n\n#### `onCardNew`\n\nWhen the user adds a new card through the default card adder template, this callback will be called passing the updated board and the new card.\n\n#### `onNewCardConfirm`\n\nWhen the user confirms a new card through the default card adder template, this callback will be called with a draft of a card with the title and the description typed by the user.\n\nYou **must** return the new card with its new id in this callback.\n\nEx.:\n\n```js\nfunction onCardNew (newCard) {\n const newCard = { id: ${required-new-unique-cardId}, ...newCard }\n return newCard\n}\n\n \n```\n\n#### `removeCard`\n\n| Arg | Description |\n| ------------ | ------------------------ |\n| `board` | Your board |\n| `fromColumn` | Column where the card is |\n| `card` | Card to be removed |\n\n## \ud83d\udc85\ud83c\udffb Styling\n\nYou can either style all the board or import our style and override it with the styles you want:\n\n| Class |\n| ----- |\n| `react-kanban-board` |\n| `react-kanban-card` |\n| `react-kanban-card-skeleton` |\n| `react-kanban-card--dragging` |\n| `react-kanban-card__description` |\n| `react-kanban-card__title` |\n| `react-kanban-column` |\n| `react-kanban-card-adder-form` |\n| `react-kanban-card-adder-button` |\n| `react-kanban-card-adder-form__title` |\n| `react-kanban-card-adder-form__description` |\n| `react-kanban-card-adder-form__button` |\n| `react-kanban-column-header` |\n| `react-kanban-column-header__button` |\n| `react-kanban-column-adder-button` |\n\n## \ud83e\uddea Tests\n\n### Unit\n\n```shell\nyarn test\n```\n\nCode coverage is saved in `coverage` folder. Open HTML report for example with\n\n```shell\nopen coverage/lcov-report/index.html\n```\n\n### End-to-end\n\nUsing [Cypress](https://www.cypress.io) test runner. Start dev server and open Cypress using\n\n```shell\nyarn dev\n```\n\nAll tests are in the [cypress/integration](cypress/integration) folder. These tests also collect code coverage and save in several formats in the `coverage` folder. Open HTML report\n\n```shell\nopen coverage/lcov-report/index.html\n```\n\nRead [Cypress code coverage guide](https://on.cypress.io/code-coverage)\n\nNote: to avoid inserting `babel-plugin-istanbul` twice during Jest tests, E2E tests run with `NODE_ENV=cypress` environment variable. 
The `babel-plugin-istanbul` plugin is included in [.babelrc](.babelrc) file only in the `cypress` Node environment, leaving the default Jest configuration during `NODE_ENV=test` the same.\n\n## \ud83d\udeb4\u200d\u2640\ufe0f Roadmap\n\nYou can view the next features [here](https://github.com/lourenci/react-kanban/milestone/1).\nFeel welcome to help us with some PRs.\n\n## \ud83e\udd1d Contributing\n\nPRs are welcome:\n\n- Fork this project.\n- Setup it:\n ```\n yarn\n yarn start\n ```\n- Make your change.\n- Please add yourself to the contributors table (we use [all contributors](https://allcontributors.org/docs/en/cli/installation) for that, we you will need that installed first):\n ```\n yarn contributors:add\n ```\n- Open the PR.\n\n### \u270d\ufe0f Guidelines for contributing\n\n- You need to test your change.\n- Try to be clean on your change. CodeClimate will keep an eye on you.\n- It has to pass on CI.\n", "readme_type": "markdown", "hn_comments": "", "gh_updated_time": "", "gh_accessed_time": "", "hn_accessed_time": ""}, {"name": "kenzok8/small-package", "link": "https://github.com/kenzok8/small-package", "tags": [], "stars": 598, "description": "\u81ea\u52a8\u540c\u6b65\u66f4\u65b0\u4e0a\u6e38\u5e93\u8f6f\u4ef6", "lang": "JavaScript", "repo_lang": "", "readme": "![kenzo github stats](https://github-readme-stats.vercel.app/api?username=kenzok8&show_icons=true&theme=merko)\n\n
\u540c\u6b65\u4e0a\u6e38\u5206\u652f\u4ee3\u7801 \n
\n
\n
\n
\n
\n
\n\n#### small-package\n\n* A collection of commonly used OpenWrt software package sources, kept in sync with upstream updates!\n\n* The generic luci version is suitable for 18.06 and 19.07\n\n* If you know of other good plugins, please submit them via issues\n\n* Thanks to all the owners of the GitHub repositories above!\n\n##### Plugin downloads:\n\n![GitHub release (latest by date)](https://img.shields.io/github/v/release/kenzok8/compile-package?style=for-the-badge&label=\u63d2\u4ef6\u66f4\u65b0\u4e0b\u8f7d)\n\n##### Notes on Secrets and TOKEN\n\n1. First you need to obtain a **Github Token**: [click here](https://github.com/settings/tokens/new) to create one,\n\n   fill in a name for the `Note` field; if you are unsure about `Select scopes`, simply **check everything**; when finished, click `Generate token` at the bottom\n\n2. Copy the **Token** generated on the page and save it locally; **the Token is only shown once!**\n\n3. **Fork** my `small-package` repository, then go into your own `small-package` repository for the following setup\n\n4. Click `Settings` in the top menu, then click `Secrets` - `New repository secret`\n\nFill in `ACCESS_TOKEN` for the `Name` field, paste your **Token** into the `Value` field, and click `Add secret` when done\n\n* This corresponds to the `ACCESS_TOKEN` name used in the `yml` workflow files under the `.github/workflows` directory (adjust it according to your own yml files)\n\n* Add `SCKEY` under the repository `Settings->Secrets` to push build results to WeChat via [ServerChan](http://sc.ftqq.com)\n\n* Add `TELEGRAM_CHAT_ID, TELEGRAM_TOKEN` under the repository `Settings->Secrets` to push build results to a `Telegram Bot`\n\n#### Usage (choose one of the following three):\n\n1. First cd into the package directory, then run\n\n```bash\n git clone https://github.com/kenzok8/small-package\n```\n2. Or add the following line to the feeds.conf.default file\n\n```bash\n src-git small8 https://github.com/kenzok8/small-package\n```\n3. Run under lede/ or under openwrt/\n\n```bash\ngit clone https://github.com/kenzok8/small-package package/small-package\n```\n\n", "readme_type": "markdown", "hn_comments": "", "gh_updated_time": "", "gh_accessed_time": "", "hn_accessed_time": ""}, {"name": "jolaleye/horizon-theme-vscode", "link": "https://github.com/jolaleye/horizon-theme-vscode", "tags": ["vscode", "theme", "vscode-theme", "dark-theme"], "stars": 597, "description": ":art: A beautifully warm dual theme for Visual Studio Code", "lang": "JavaScript", "repo_lang": "", "readme": "\n \n
\n\n---\n\n
\n\n## Installation\n\n1. Open the **Extensions** sidebar in VS Code\n2. Search for `Horizon Theme`\n3. Click **Install**\n4. Open the **Command Palette** with `Ctrl+Shift+P` or `\u21e7\u2318P`\n5. Select **Preferences: Color Theme** and choose a Horizon variant.\n6. Enjoy! \ud83c\udf89 Also, check out some of the personalization options below...\n\n## Personalization\n\nTastes change all the time. Fortunately, VS Code makes it easy to customize just about every aspect of your editor.\nIf you want to change something, open the **Command Palette** and select **Preferences: Open Settings (JSON)**. Here, you can override VS Code's defaults or Horizon's colors.\nCheck out some of the personalization options below to customize Horizon to suit your taste.\n\n_For more info on theming, visit the [Theme Authoring Guide](https://code.visualstudio.com/api/extension-capabilities/theming) and [Theme Color Reference](https://code.visualstudio.com/api/references/theme-color)._\n\n### Contrast\n\nTo add a border between sections of the editor, add the following to your settings...\n\n```\n\"workbench.colorCustomizations\": {\n \"contrastBorder\": \"#16161C\"\n}\n```\n\nOr for Bright variants...\n\n```\n\"workbench.colorCustomizations\": {\n \"contrastBorder\": \"#1A1C231A\"\n}\n```\n\n### Italics\n\nThe normal theme only uses italics in a few places. If you would prefer no italics at all, you can configure this in your settings...\n\n```\n\"editor.tokenColorCustomizations\": {\n \"textMateRules\": [\n {\n \"name\": \"No italics\",\n \"scope\": [\"comment\", \"markup.quote\", \"variable.language\", \"variable.parameter\"],\n \"settings\": {\n \"fontStyle\": \"normal\"\n }\n }\n ]\n}\n```\n\n### Tag Brackets `<>`\n\nFor gray rather than red brackets around HTML tags...\n\n```\n\"editor.tokenColorCustomizations\": {\n \"textMateRules\": [\n {\n \"name\": \"Tag brackets\",\n \"scope\": [\"punctuation.definition.tag\"],\n \"settings\": {\n \"foreground\": \"#BBBBBB\"\n }\n }\n ]\n}\n```\n\n## Contributing\n\nCheck out the [contributing guide](CONTRIBUTING.md) to learn how you can report issues and help make changes.\n\nAlways be sure to follow the [Code of Conduct](CODE_OF_CONDUCT.md).\n\n## License\n\n[MIT](LICENSE) \u00a9 [Jonathan Olaleye](https://github.com/jolaleye)\n", "readme_type": "markdown", "hn_comments": "", "gh_updated_time": "", "gh_accessed_time": "", "hn_accessed_time": ""}, {"name": "dickrnn/dickrnn.github.io", "link": "https://github.com/dickrnn/dickrnn.github.io", "tags": [], "stars": 597, "description": "a recurrent network trained to draw dicks", "lang": "JavaScript", "repo_lang": "", "readme": "# DICK-RNN\n\nA recurrent neural network trained to draw dicks.\n\nDemo: https://dickrnn.github.io/\n\nGitHub: https://github.com/dickrnn/dickrnn.github.io/\n\n \n\nThis project is based on the original [sketch-rnn demo](https://magenta.tensorflow.org/assets/sketch_rnn_demo/index.html), and is a fork of [sketch-rnn-js](https://github.com/tensorflow/magenta-demos/tree/master/sketch-rnn-js/README.md), but customized for dicks.\n\nThe methodology is described in this paper: https://arxiv.org/abs/1704.03477\n\nDataset used for training is based on [Quickdraw-appendix](https://github.com/studiomoniker/Quickdraw-appendix).\n\n## Media Coverage\n\n*\u201cMean Time To Dick is a key measure of any given human or machine intelligence system.\u201d* \u2014 [Elon Musk](https://twitter.com/elonmusk/status/1253834316242616328), on dick-rnn.\n\nReddit: dick-rnn discussions on 
[/r/MachineLearning](https://redd.it/g6og9l) and [r/javascript](https://redd.it/g6opsc).\n\nArticles about dick-rnn around the world, in [The Next Web](https://thenextweb.com/neural/2020/04/24/neural-network-draw-dicks/), [Boing Boing](https://boingboing.net/2020/04/24/this-ais-only-function-is-to.html), [PC Gamer](https://www.pcgamer.com/someone-taught-an-ai-to-draw-dicks-after-feeding-it-25000-doodles-of-penises/), [Mashable](https://mashable.com/article/dickrnn-dick-doodle-neural-network-ai/), [Dlisted](https://dlisted.com/2020/04/26/hot-slut-of-the-day-2278/), [9GAG](https://9gag.com/gag/a0NvxQn/you-can-train-an-ai-to-draw-a-penis), [C\u00f3digo Espagueti](https://codigoespagueti.com/noticias/tecnologia/red-neuronal-10000-penes-dibujar/) and [HD Tecnologia](https://www.hd-tecnologia.com/alguien-enseno-a-una-ia-como-dibujar-penes-suministrandole-25-000-dibujos-distintos/) (Spanish), [Feber](https://feber.se/internet/ai-larde-sig-rita-penis/410425/) (Swedish), [dobreprogramy](https://www.dobreprogramy.pl/Wykorzystali-sztuczna-inteligencje-do-rysowania-penisow,News,107621.html) (Polish), [futuretech](https://futurezone.at/digital-life/kuenstliche-intelligenz-kritzelt-penisse/400824998) (Austrian German), [4Gamers](https://www.4gamers.com.tw/news/detail/42931/someone-taught-an-ai-to-draw-dicks-after-feeding-it-25000-doodles-of-penises) (Traditional Chinese), [Gigazine](https://gigazine.net/news/20200426-dickrnn/) and [Karapaia](http://karapaia.com/) (Japanese).\n\n# Why?\n\nFrom Studio Moniker's [Quickdraw-appendix](https://studiomoniker.com/projects/do-not-draw-a-penis) project:\n\n*In 2018 Google open-sourced the [Quickdraw data set](https://github.com/googlecreativelab/quickdraw-dataset). \u201cThe world's largest doodling data set\u201d. The set consists of 345 categories and over 50 million drawings. For obvious reasons the data set was missing a few specific categories that people seem to enjoy drawing. This made us at Moniker think about the moral reality big tech companies are imposing on our global community and that most people willingly accept this. Therefore we decided to publish an appendix to the Google Quickdraw data set.*\n\nI also believe that [\u201cDoodling a penis is a light-hearted symbol for a rebellious act\u201d](https://www.theverge.com/tldr/2019/6/17/18681733/google-ai-doodle-detector-penis-protest-moniker-mozilla) and also \u201cthink our moral compasses should not be in the hands of big tech\u201d.\n\n# Dick Demos\n\n[Main Dick Demo](https://dickrnn.github.io/)\n\n[Predict Multiple Dicks](https://dickrnn.github.io/multi.html)\n\n[Predict Single Dick](https://dickrnn.github.io/predict.html)\n\n[Simple Dick Demo](https://dickrnn.github.io/simple.html)\n\n## Example Dicks from Main Demo\n\nThe dicks are embedded in the query string after `share.html`.\n\nExamples of sharable generated dick doodles:\n\n\n \n
\n\n# Dataset\n\nThis recurrent neural network was trained on a [dataset](https://github.com/studiomoniker/Quickdraw-appendix) of roughly 10,000 dick doodles.\n\nThe [Quickdraw-appendix](https://github.com/studiomoniker/Quickdraw-appendix) dataset was processed via incremental RDP epsilons to fit most dicks within 200 steps. Note that I used the raw version, not their simplified version, since the dicks were more detailed. The processed dataset that is compatable with [sketch-rnn](https://github.com/hardmaru/sketch-rnn-datasets/)'s strokes (no pun) is in this repo as `dataset/dicks.npz` and can be loaded this way:\n\n```python\nfilename = \"dataset/dicks.npz\"\nload_data = np.load(filename)\ntrain_set = load_data['train']\nvalid_set = load_data['valid']\ntest_set = load_data['test']\n\nprint(len(train_set))\n> Output: 9500\n\nprint(len(valid_set))\n> Output: 496\n\nprint(len(test_set))\n> Output: 496\n \n# draw a random example (see draw_strokes.py)\ndraw_strokes(random.choice(train_set), factor=0.5)\n```\n\nTraining samples from the dataset:\n\n \n\nFor best results, train with default [sketch-rnn](https://github.com/tensorflow/magenta/tree/master/magenta/models/sketch_rnn) settings, but use a dropout keep probability of 80%. Early stopping was performed on the validation set. To maximize samples used for training/validation, no test set is used, and the test set is just set to the same 496 validation samples to be compatable with the data format expected by the existing code.\n\nCommand used to train the TensorFlow [sketch-rnn](https://github.com/tensorflow/magenta/tree/master/magenta/models/sketch_rnn) model:\n\n```\npython sketch_rnn_train.py --data_dir=dataset --gpu=0 --log_root=log --hparams=data_set=['dicks.npz'],num_steps=1000000,conditional=0,dec_rnn_size=512,recurrent_dropout_prob=0.8\n```\n\nI found a [Jupyter notebook](https://github.com/magenta/magenta-demos/blob/master/jupyter-notebooks/Sketch_RNN_TF_To_JS_Tutorial.ipynb) in the [sketch-rnn repo](https://github.com/magenta/magenta-demos) that easily converted the TensorFlow checkpoint into the `.gen.full.json` format that `sketch-rnn-js` can use, with the command:\n\n```\nnode compress_model.js custom.gen.full.json dick.gen.js\n```\n\n*__Update (4/27/2020)__ The Quickdraw-appendix dataset was updated, and there are now 25K examples, up from the earlier 10K. I processed the newer dataset as `dicksv2.npz` with a proper train/valid/test split of 23000/1500/693 samples. The [Main Demo](https://dickrnn.github.io/) has been updated to use a larger, but slightly slower model trained on the revised dataset containing more training examples.*\n\n# License\n\nOriginal license for each file is indicated in the header comment for each source file, or referenced in the URL stated in the source file. Creative Commons License for datasets also indicated accordingly.\n\nMIT License for my specific additional work.\n", "readme_type": "markdown", "hn_comments": "This is hilarious! So is your username!Browser Demo: https://dickrnn.github.io/", "gh_updated_time": "", "gh_accessed_time": "", "hn_accessed_time": ""}, {"name": "bitpay/bitcore-wallet-service", "link": "https://github.com/bitpay/bitcore-wallet-service", "tags": [], "stars": 597, "description": "A multisig, HD Bitcoin and Bitcoin Cash wallet service. Used by Copay.", "lang": "JavaScript", "repo_lang": "", "readme": "\n# bitcore-wallet-service\n\n\nTHIS REPO HAVE BEEN MOVED TO BITCORE's MONO REPO. 
Check: \nhttps://github.com/bitpay/bitcore/tree/master/packages/bitcore-wallet-service\n", "readme_type": "markdown", "hn_comments": "", "gh_updated_time": "", "gh_accessed_time": "", "hn_accessed_time": ""}, {"name": "CindyJS/CindyJS", "link": "https://github.com/CindyJS/CindyJS", "tags": ["geometry", "mathematics", "javascript"], "stars": 596, "description": "A JavaScript framework for interactive (mathematical) content.", "lang": "JavaScript", "repo_lang": "", "readme": "# CindyJS\n\n**CindyJS is a framework to create interactive\n(mathematical) content for the web.**\n\nIt aims to be compatible with [Cinderella](http://cinderella.de/),\nproviding an interpreter for the scripting language CindyScript\nas well as a set of geometric operations which can be used to describe\nconstructions.\nTogether, these components make it very easy to visualize various\nconcepts, from geometry in particular and mathematics in general,\nbut also from various other fields.\n\nSee also our [project page](https://cindyjs.org).\n\n## Examples\n\nExamples on the web can be seen [here](https://cindyjs.org/gallery/main/).\n\nThere is also [an `examples` directory](https://github.com/CindyJS/CindyJS/tree/master/examples)\ninside the repository, demonstrating individual functions and operations.\n\nDevelopers can run these examples from their local development copy.\nSome examples may require a webserver-like environment to avoid\ntriggering browser security measures associated with local files.\nTo do so, one can run node_modules/.bin/st -l -nc
\nin the root of the development tree, and then visit\n[the local copy of the examples directory](http://127.0.0.1:1337/examples/).\n\n## Building\n\nIf you have `npm` installed, running `npm install`\nin the top level source directory should just work.\n\nIf you lack a compatible setup of `npm` and `node`,\nrunning `make build=release` in the top level source directory should\nbe able to get a suitable setup installed inside the project directory tree.\nIn general, all required third-party tools should be automatically downloaded\nand installed inside the project directory tree.\nOne exception is a Java Runtime Environment, which has to be installed before\n(because users have to manually accept the terms and conditions before\nbeing allowed to download a JRE).\n\nIf `npm` resp. `make` terminated successfully, then `build/js` will contain\nthe artefacts which you'd likely want to include in your web site.\nIf you are building from an official commit, then `make build=release deploy`\nwill create `build/deploy` which is even better suited to be put on a web server,\nsince it references the commit at GitHub which may help diagnose problems.\n\n### Building on Windows\n\nThe description above uses `make` mostly for convenience.\nPretty much all the commands are in fact passed on to\na JavaScript-based build system contained in the `make` directory.\nIf you don't have `make` available on Windows,\nyou can call `node make` instead.\nSo a standard release build would be `node make build=release`.\n\nNote that you should have the following software installed:\n\n- A recent Java Runtime Environment (JRE)\n- Node.js with the `node` command added to the PATH\n- Git for Windows with the `git` command usable from the Windows Command Prompt\n\n## Contributing\n\nWhen you work on the code base the simple `make` or `node make`\nwill give you a build which is fast to compile and easy to debug.\nIn contrast to this, `node make build=release` will\nperform additional compilation steps like running the Closure Compiler.\nIt may issue more warnings, which in turn might be useful when developing.\nYou should make sure that your code works in both build modes.\n\nIf you are confident that your work is done, call `make alltests`\nafter you did `git add` to stage your changes.\nThat will ensure that your modifications pass all kinds of tests.\nThe same tests will be run automatically on pull requests.\nOnce your modifications satisfy your expectations, pass these tests\nand are accompanied by a suitable test case or demonstrating example\n(where appropriate), you may file a pull request for your changes.\n\n## Documentation\n\n[The CindyJS API documentation](https://github.com/CindyJS/CindyJS/blob/master/ref/createCindy.md)\ndescribes how to create a widget on an HTML page using this framework.\n\nOther documentation in [the `ref` directory](https://github.com/CindyJS/CindyJS/tree/master/ref) describes\nlarge portions of the CindyScript programming language. This\ndocumentation, however, started as a copy of\n[the corresponding Cinderella documentation](http://doc.cinderella.de/tiki-index.php?page=CindyScript). It\nis currently meant as a goal of what functionality _should_ be\nsupported, while actual support might still be lagging behind. 
If there\nis a particular feature you'd need for your work, don't hesitate to\n[file a feature request](https://github.com/CindyJS/CindyJS/issues) for it.\n\n## License\n\nCindyJS is licensed under the\n[Apache 2 license](http://www.apache.org/licenses/LICENSE-2.0.html).\n", "readme_type": "markdown", "hn_comments": "You might wonder why there are only 2 members and that is because this was just announced. Chris Fritz used to live in the Lansing area and is the guy who turned me on to Vue.He did such a superb job on the Vue docs that he serves as an inspiration for my Code For America project documentation efforts. Burnout is real whether it's on a startup or an open source project.50+ SPEAKERS, 7 WORKSHOPS, VIRTUAL & INTERACTIVEJOIN US for a JS World Festival with 50 and more speakers across the world. 2 Days of workshops in TypeScript, NodeJS and Svelte. Also 3 Full Days with 50 speakers, library contributors, maintainers such as Natalia Tepluhina (Vue Core Team), Gleb Bahmutov (Cypress.io), Nader Dabit (AWS, React Native Elements Contributor), and Matteo Colina (NodeJS TSC member).It's always good to see someone take a big swing and try something new, but I'd love to hear the VC case for this service. I have trouble believing that 1) many people will want this 2) churn rate won't be murderous.Why not video?Hey Ross,Love this idea! How are you thinking about sharing Artifacts with less technically inclined recipients?I'd love to give this as a gift to my parents but I've unsuccessful tried a few times to get them set up with listening to podcasts.uselessI tried to do this manually over Zoom with my parents at the start of the pandemic.My mom focused on walking through the family tree and dad keeps putting it off.This will for sure be a Christmas gift. Amazing idea.My Dad was a sound engineer. When I was 9 or 10 I interviewed my neighbour about life in London during the Blitz in WWII for a school project. We recorded the audio, edited it (on DAT!), and added some background effects for atmosphere.It was brilliant: everyone loved it and I never got a better grade for any homework again.So yeah, this could work. Good luck!Had a similar idea recently and I'm so happy someone has taken this to fruition! Would love to know when you guys can do the interviews in Spanish since I'd want to get a few of my parents and grandparents stories.This is a lovely idea!Have you heard of the BBC's \"The Listening Project\"? It's a partnership between BBC and The British Library. https://www.bbc.co.uk/programmes/b01cqx3b> Capturing the nation in conversation> Since 2012 we have been collecting intimate conversations between friends or relatives, to build a unique picture of our lives today. We have collected over a thousand so far, and most will be broadcast across BBC radio, while all are archived by the British Library, preserving them for future generations.I'd like to hear the creations as a type of \"This American Life\" series. Is there a way to subscribe to Artifact as a podcast?This is interesting. I use soundcloud for this today. I have a few recordings of my family talking about the Great Depression, and some USO WW2 recordings from my Great Grandpa. Not sure If I would pay for this, since soundcloud is free - but I generally like this ideaI had a couple of good conversations with George. This is great example of how the team informs the product. 
Great work Artifact team!My sister and I interviewed our grandfather before he passed, after being inspired by a friend who does this privately as an offline, local service. I'm so pleased to see this, it's heartwarming and so powerful. Having our grandfather's voice is priceless. Plus the stories we learned that otherwise would have remained with him!Question: my friend would be perfect for this as a job for her, how might she go about applying?As an aside could you also use it to write biographies? Biography-as-a-service I imagine might go down well..This is a great idea, great product. Pricing is very reasonable. I'll recommend thisVery cool!Have you heard of StoryCorp? [0] It's a nonprofit which sets up booths and recording equipment around the world, then records and transcribes intimate conversations between individuals and catalogues those stories in the Library of Congress so they can be found and remembered.They have hundreds of thousands of conversations at this point. Pretty remarkable initiative, similar in philosophy to yours. Dave Isay, the founder, describes it as a \"hope machine\", kind of the opposite of reality TV.StoryCorps also produces these 3-minute summaries of some of their more notable conversations, which cover the whole range of human emotion. Here's a funny personal favorite: https://storycorps.org/stories/betty-jenkins/[0]: https://storycorps.org/Fantastic idea! I find it really hard to give meaningful presents to people I care about. They already have everything they need, so often it's just food, wine, etc. For kids (that already have way too much toys) it's just more toys. Giving a heart warming experience is so much better, and your idea really makes that easier.Hey Ross,Amazing idea. My friend is an Investigative Journalist and as such - asks GREAT questions. I'm sure he'd love to be part of your roster. I think that's how you scale this by creating a deep roster of absolutely phenomal interviewers.I really like the product. If I had more money I would probably buy it.One thing that could be helpful is to provide to a private RSS url in addition to web link, so I can play it in my podcast player.Maybe there\u2019s a DIY version of this product? You make available the internal tools that interviews use but let me actually do the interview? To make it cheaper.I\u2019m actually about to start interviewing my own family members to get some of our history recorded, so I'm very interested in this use case.I think this might be appealing to someone like me, but that might be unfortunate.Isn't this just a suboptimal substitute for a conversation?Edit: This seems like such a great product, I'm really excited to give it a try. The Instagram snippets helped me appreciate what you are doing.This is interesting. Audio is a super interesting format, and relatively unformed as a an asynchronous communication method.I _was hoping_ from the title that it was the app that a friend and I joke about each time we send an overly long voice note message:# FriendCast TM_Voice notes for your long, winding asynchronous conversations_* Keep track of what they said, what you said* Listen once, respond as you think of it* Audio threads to stay on trackIf anyone has any thoughts on improving voice notes, I'm all ears.(yes I hate voice notes just like everyone, the stream of consciousness can be terrible, but with some people it's a great way of keeping in touch async, and extensively used)Fantastic idea. My sister and I interviewed my parents in 2005 about how they met. 
Now that my dad's passed, the video has become an heirloom. It's an hour-long raw video though - editing it down into a tight story with an arc would be incredible.At his eulogy I talked about how great it is that hundreds of awesome people came to say nice things about him. But it would be even nicer if he heard those things when he was alive. I encouraged attendees to go tell their loved ones how they felt before they were gone.Artifact seems like it can help with that in a very polished way.I can also easily imagine spending a few hundred $$ on this once or twice a year for various people in my life. It's on the high side (e.g. \"meaningful gift/memorable occasion\" pricing), but given your audio samples it seems price indicates quality.Good stuff. Will be following your journey and will become a customer at some point.This is interesting, as IIRC another YC alum had a (non-YC) startup that failed that did almost exactly this (which shut down a while ago) - a marketplace of interviewers that recorded stories from older family members. Dunno if you've connected with him or not but seems like it could be worthwhile to chatI love this idea! I tried to do something similar for my dad who has lead quite the interesting life and is nearing his 80's. I quickly learned that I'm neither a skilled interviewer or audio editor. The price seems like a steal to have a pro handle all of that.Congrats. Great product.Just curious - did you guys present at last year's YC hackathon? I think I remember you! (we were the social film-camera team). Congrats on the launch!I love this. I've wanted this to exist for years but never did anything with it, so I'm thrilled you have. Can you say more about the marketplace of interviewers? How do you recruit them? How are they vetted? How does matching work?Cool idea, though I'm interested in how you're thinking about scaling this? I co-founded Castup[1], a podcast editing startup and while we deliver quite a few episodes a month, I can't ever see it scale enough to require venture capital.We are profitable and have been since our second week of operation.[1] https://useCastup.comWhy does this service frighten me? And, not in the way that global warming does.This is really cool. I wish there was a way to have non-english speaking journalists as interviewers, I'd totally set them loose on many of my family members.Edit: Your homepage doesn't explain what you do well enough, IMO. When I visited it I wasn't sure what you do - but reading your post explained it to me (I think) - you hire journalists to interview your loved ones and turn it into a podcast. I didn't get that from your homepage.What an absolutely amazing idea. This hits me right in my interests. Kudos & good luck!Already a customer, this is a lovely company, wish you more success. Had some of my first interviews done this past week, solidifying information our family hasn't fully shared with each other. Its invaluable.That is really amazing idea. In quite a long time I haven't seen a startup that isn't justa) pure entertainment\nb) automation/facilitation of some boring/mundane taskWhile the above are useful without a doubt, your service provides something more than just an utility - an enormous emotional value that I really admire. Once I prepared a kind of audio commemoration for incurably ill nephew of a friend of mine and I'll never forget the look of her eyes when she got this CD. 
I'm sure Artifact will provide such effects with greater intensivity and scale.This is so good, the Escher & Droste effect sample is one of my favorite : https://cindyjs.org/gallery/main/Droste/Thank you for sharing this, I am generally interested in any tool that makes the creation of explorable explanations easier.However, while I am passably competent as an educator/physicist/js dev, I have a hard time understanding what the use case for this library is. Is it a plotting library mainly? A plotting library with latex built in? An ODE solver? An UI-building toolkit? The showcase gallery does not make that clear and it is pretty difficult to find example snippets that go to the core of the library and how its creator imagines it being used. Could you elaborate on why I should use this instead of vanilla javascript + my favorite UI library + my favorite canvas library?Edit: these slides seem to cover some of my questions.BeautifulFor those trying to contextualize this, CindyJS is based on Cinderella, a geometry framework developed by J\u00fcrgen Richter-Gebert. He also wrote a book called \"Perspectives on Projective Geometry\" where he goes into full detail about the mathematics behind this framework.I had previously created a hackernews thread about this (amazing) book, to see if anyone wanted to do a reading group with me on it, but didn't get many takers:\nhttps://news.ycombinator.com/item?id=23371641This looks very cool. I'm gonna be using this soon!I am curious why you'd create a new scripting language for this, how is JS limiting for building these visualizations?Kleinian fractals is super fun to play with:https://cindyjs.org/gallery/main/Kleinian/I suggest that the complete fare (A to C) should be divided by the 2 passengers. BTW John should try to stay in touch with Cindy!", "gh_updated_time": "", "gh_accessed_time": "", "hn_accessed_time": ""}, {"name": "jwhitley/requirejs-rails", "link": "https://github.com/jwhitley/requirejs-rails", "tags": [], "stars": 596, "description": "RequireJS support for your Rails 3 or 4 application", "lang": "JavaScript", "repo_lang": "", "readme": "\n# RequireJS for Rails\n\nIntegrates [RequireJS](http://requirejs.org/) into the Rails 3+ Asset Pipeline.\n\n**UPGRADE NOTES:** Users upgrading within the 0.x series should read the Changes section for relevant usage changes. We're pushing hard to 1.0, when the configuration and setup details will be declared stable. Until that time expect some bumps as things bake out.\n\n## Usage\n\n1. Add this to your Rails app's `Gemfile`:\n\n ```\n gem 'requirejs-rails'\n ```\n\n2. Remove all Sprockets directives such as `//= require jquery` from `application.js` and elsewhere. Instead establish JavaScript dependencies using AMD-style `define()` and `require()` calls.\n\n3. Use `requirejs_include_tag` at the top-level of your app's layout(s). Other modules will be pulled in dynamically by `require.js` in development and for production builds optimized by `r.js`. Here's a basic `app/views/layouts/application.html.erb` modified for `requirejs-rails`:\n\n ```erb\n \n \n \n Frobnitz Online \n <%= stylesheet_link_tag \"application\" %>\n <%= requirejs_include_tag \"application\" %>\n <%= csrf_meta_tags %>\n \n \n \n\n <%= yield %>\n\n \n \n ```\n\n4. Organize your JavaScript or CoffeeScript code into modules using `define()`:\n\n ```coffeescript\n # app/assets/javascripts/views/tweet_view.js.coffee\n\n define ['backbone'], (Backbone) ->\n class TweetView extends Backbone.View\n # ...\n ```\n\n5. 
Instantiate your app using `require()` from a top-level module such as `application.js`:\n\n ```coffeescript\n # app/assets/javascripts/application.js.coffee\n\n require ['jquery', 'backbone', 'TheApp'], ($, Backbone, TheApp) ->\n\n # Start up the app once the DOM is ready\n $ ->\n window.App = new TheApp()\n Backbone.history.start\n pushState: true\n window.App.start()\n ```\n\n6. When ready, build your assets for production deployment as usual.\n `requirejs-rails` defaults to a single-file build of `application.js`.\n Additional modules and r.js layered builds may be specified via\n `config/requirejs.yml`; see the Configuration section below.\n\n ```rake assets:precompile```\n\n## Configuration\n\n### The Basics\n\nConfiguration lives in `config/requirejs.yml`. These values are inspected and\nused by `requirejs-rails` and passed along as configuration for require.js and\n`r.js`. The default configuration declares `application.js` as the sole\ntop-level module. This can be overridden by creating\na `config/requirejs.yml`, such as:\n\n```yaml\nmodules:\n - name: 'mytoplevel'\n```\n\nYou may pass in [require.js config\noptions](http://requirejs.org/docs/api.html#config) as needed. For example,\nto add path parameters:\n\n```yaml\npaths:\n d3: \"d3/d3\"\n \"d3.time\": \"d3/d3.time\"\n```\n\n### Layered builds\n\nOnly modules specified in the configuration will be created as build artifacts\nby `r.js`. [Layered r.js\nbuilds](http://requirejs.org/docs/faq-optimization.html#priority) be\nconfigured like so:\n\n```yaml\nmodules:\n - name: 'appcommon'\n - name: 'page1'\n exclude: ['appcommon']\n - name: 'page2'\n exclude: ['appcommon']\npriority: ['appcommon']\n```\n\nIn this example, only modules `page1` and `page2` are intended for direct\nloading via `requirejs_include_tag`. The `appcommon` module contains\ndependencies shared by the per-page modules. As a guideline, each module in\nthe configuration should be referenced by one of:\n\n- A `requirejs_include_tag` in a template\n- Pulled in via a dynamic `require()` call. Modules which are solely\n referenced by a dynamic `require()` call (i.e. a call not optimized by r.js)\n **must** be specified in the modules section in order to produce a correct\n build.\n- Be a common library module like `appcommon`, listed in the `priority` config\n option.\n\n### Almond support\n\nThis gem supports single-file builds with\n[almond](https://github.com/jrburke/almond). Use the following setting in\n`application.rb` to enable it:\n\n```ruby\nconfig.requirejs.loader = :almond\n```\n\nAlmond builds have the restriction that there must be exactly one `modules` entry in\n`requirejs.yml`. Typically the [wrap option](https://github.com/jrburke/r.js/blob/master/build/example.build.js#L275) will be used to create a self-contained build:\n\n```yaml\nmodules:\n - name: 'main'\nwrap: true\n```\n\n### Build-time asset filter\n\nThe `requirejs-rails` build process uses the Asset Pipeline to assemble assets\nfor the `r.js` build. By default, assets ending in `.js`, `.html`, and `.txt`\nwill be made available to the build. If you have other asset suffixes to\ninclude, use the `logical_path_patterns` config setting to add them.\n\nFor example, if your templates all end in `.templ` like so...\n\n```javascript\n// in app/assets/javascripts/myapp.js\ndefine(function (require) {\n var stuff = require('text!stuff.templ');\n // ...\n});\n```\n\n... 
then this config setting will ensure they're picked up in the build:\n\n```ruby\n# in config/application.rb\nconfig.requirejs.logical_path_patterns += [/\\.templ$/]\n```\n\n## Advanced features\n\n### Additional data attributes\n\n`requirejs_include_tag` accepts an optional block which should return a hash.\nThis hash will be used to populate additional `data-...` attributes like so:\n\n```erb\n<%= requirejs_include_tag \"page1\" do |controller|\n { 'foo' => controller.foo,\n 'bar' => controller.bar\n }\n end\n%>\n```\n\nThis will generate a script tag like so:\n\n```\n\n```\n\n### External domain (CDN) support\n\nThere are two ways in which requirejs-rails supports the use of different\ndomains for serving built JavaScript modules, as is the case when using\na [CDN](http://en.wikipedia.org/wiki/Content_delivery_network).\n\n1. URLs in paths config in `requirejs.yml`:\n\n If requirejs-rails encounters an URL as the right-hand side of a paths\n configuration, it will correctly emit that as `\"empty:\"` during the build\n process so that [r.js will do the right thing](http://requirejs.org/docs/optimization.html#empty).\n\n Example:\n\n ```yaml\n paths:\n jquery: \"https://ajax.googleapis.com/ajax/libs/jquery/1.7.2/jquery.min.js\"\n ```\n\n2. Deploying all requirejs-rails assets to a CDN:\n\n In `config/environments/production.rb` (or another environment)\n set the run_config as follows:\n\n ```ruby\n config.requirejs.run_config['baseUrl'] = 'http://mycdn.example.com/12345abc/assets'\n ```\n\n The [`asset_sync` gem](https://github.com/rumblelabs/asset_sync) is one\n tool that can be used to deploy your built assets to a CDN (S3, in this\n case).\n\n## Troubleshooting\n\n### Avoid `config.assets.precompile`\n\nDon't set `config.assets.precompile` to reference any of your AMD module code.\nAvoid it altogether, except to reference non-AMD code that you're loading via\njavascript_include_tag, and which is **never** referenced by the AMD codebase.\n\n## Using AMD libraries\n\nI currently recommend placing your AMD libraries into\n`vendor/assets/javascripts`. The needs of a few specific libraries are\ndiscussed below.\n\n### jQuery\n\njQuery users must use jQuery 1.7 or later (`jquery-rails >= 1.0.17`) to use it as an [AMD module](https://github.com/amdjs/amdjs-api/wiki/AMD) with RequireJS. To use jQuery in a module:\n\n```coffeescript\n# app/assets/javascripts/hello.js\n\ndefine ['jquery'], ($) ->\n (id) ->\n $(id).append('hello!
')\n```\n\n### Backbone.js\n\nBackbone 0.9.x doesn't support AMD natively. I recommend the [amdjs\nfork of Backbone](https://github.com/amdjs/backbone/) which adds AMD\nsupport and actively tracks mainline.\n\n### Underscore.js\n\nUnderscore 1.3.x likewise doesn't have AMD support. Again, see\nthe [amdjs fork of Underscore](https://github.com/amdjs/underscore).\n\n## 0.x API Changes\n\nUsage changes that may break functionality for those upgrading along the 0.x\nseries are documented here. See [the Changelog](https://github.com/jwhitley/requirejs-rails/blob/master/CHANGELOG.md) for the full\nlist of feature additions, bugfixes, etc.\n\n### v0.9.2\n\n- Support for Rails 4.\n\n### v0.9.0\n\n- The upgrade to RequireJS and r.js 2.0 includes changes that will break some\n apps.\n\n### v0.5.1\n\n- `requirejs_include_tag` now generates a data-main attribute if given an argument, ala:\n\n ```erb\n <%= requirejs_include_tag \"application\" %>\n ```\n\n This usage is preferred to using a separate\n `javascript_include_tag`, which will produce errors from require.js or\n r.js if the included script uses define anonymously, or not at all.\n\n### v0.5.0\n\n- `application.js` is configured as the default top-level module for r.js builds.\n- It is no longer necessary or desirable to specify `baseUrl` explicitly in the configuration.\n- Users should migrate application configuration previously in `application.js` (ala `require.config(...)`) to `config/requirejs.yml`\n\n\n\n## TODOs\n\nPlease check out [our GitHub issues page](https://github.com/jwhitley/requirejs-rails/issues)\nto see what's upcoming and to file feature requests and bug reports.\n\n----\n\nCopyright 2011-2014 John Whitley. See the file MIT-LICENSE for terms.\n", "readme_type": "markdown", "hn_comments": "", "gh_updated_time": "", "gh_accessed_time": "", "hn_accessed_time": ""}, {"name": "piotte13/SIMD-Visualiser", "link": "https://github.com/piotte13/SIMD-Visualiser", "tags": ["simd", "visualisation", "intrinsics", "vectorized-computation", "compilers"], "stars": 596, "description": "A tool to graphically visualize SIMD code", "lang": "JavaScript", "repo_lang": "", "readme": "\n# Live Version found here (Proof of concept)\n[http://piotte13.github.io/SIMD-Visualiser](http://piotte13.github.io/SIMD-Visualiser)\n\n![](doc/SIMD-Visualizer-Demo.gif)\n( This is a Prototype version, we are still in development! Thanks for your support :-) \n# But wait, what is SIMD?\nSIMD (pronounced \"seem-dee\") is short for **Single Instruction/Multiple Data** which is one [classification of computer architectures](https://en.wikipedia.org/wiki/Flynn%27s_taxonomy \"classification of computer architectures\"). SIMD allows one operation to be performed on multiple data points simultaneously. Data level parallelism improves the performance of many tasks, including 3D graphics and video processing, physics simulations, and cryptography.\n\n# Why would one need to visualize it?\nThe first time I saw SIMD code, I almost had a heart attack. My brain was overwhelmed, my stress level rose, my face turned white like a sheet of paper and cold sweats started to flow all over my body. SIMD code is not designed to be easily understood by the human brain: it's made for machines.\n\nThe thing is, we still need SIMD. It's powerful and once you understand what it does, it's quite simple. So, how do we understand what it does then? We visualize it! 
We make it look simple with animations, colors and graphics!\n\nOur goal is to experiment with different visualization methods, until we figure out the ones that are so easy to understand that even our grandmothers would think it's a kids play. For now, we think we found a solution that allows anyone with basic computer science knowledge to understand any given SIMD code, quickly and free of hearth attacks. \n\n# Basic Features\n- Graphical Visualization. \ud83d\udd25\ud83d\udd25\n- Abstract Syntax Tree (AST) \ud83c\udf33\ud83c\udf31\n- Write, compile and find bugs in SIMD code. \ud83d\ude2e\ud83d\ude0d\n\n\n# How does it work?\nSo, you are wondering how we made it, right? Did we hire a magician? Let's see... \ud83d\ude09\n\nAt first thought, parsing C code might seem like a trivial task. But it's not, it's actually laborious, painful and brain twisting... C is a deeply complex language, therefore it cannot be parsed using only regular expressions, we need a lot more fancy techniques and effort. In addition to that, there is no available C code parser written in JavaScript that we could directly use in our project. At least, we did not find any.\n\nDon't get me wrong, it's totally possible to make one, after all, any kind of compiler has to parse code before it can tweak it and do its thing. That's where things get interesting, compilers like [Clang](https://clang.llvm.org/) already have an integrated parser, and a surprisingly good one, so why try and reinvent the wheel? Why not capitalize on what is already made and use it at our advantage? Well, that is for sure a good point... So, we tried using the compiler and it works like magic. It is fast, efficient and simple to use.\n\n Thanks to [Matt Godbolt](https://github.com/mattgodbolt/compiler-explorer) and his [Compiler Explorer](https://godbolt.org/), we were able to compile our SIMD code to assembly using any version of Clang through his free and open source REST API. We also discovered that the Clang compiler can produce an [Abstract Syntaxic Tree](https://en.wikipedia.org/wiki/Abstract_syntax_tree) that we can later use as another visualization strategy.\n\nOkay, now that we have compiled SIMD code, what do we do? \n\n 1. Parse it.\n 2. Draw it.\n 3. Animate it.\n\nAlthough we skipped the C code parsing using Clang, we still need to parse the assembly code, or else it's useless to us. One particularity of assembly code is that it's easy to parse. In JavaScript jargon, it's as simple as String.split(). But what about the complexity of the code, doesn't assembly add a lot of junk? Well, an interesting feature of compiled SIMD code is that it's quite simple, there is no additional complexity like we would find in assembly generated from traditional C code.\n\nHere's an example: \n\nC code (SIMD):\n``` \n#include \n\n__m128i PrefixSum(__m128i curr) {\n\t__m128i Add = _mm_slli_si128(curr, 4); \n\tcurr = _mm_add_epi32(curr, Add); \n\tAdd = _mm_slli_si128(curr, 8); \n\treturn _mm_add_epi32(curr, Add); \n} \n```\n\nAssembly:\n```\nPrefixSum(long long __vector(2)):\n\tvpslldq xmm1, xmm0, 4\n\tvpaddd xmm0, xmm1, xmm0\n\tvpslldq xmm1, xmm0, 8\n\tvpaddd xmm0, xmm0, xmm1\n\tret\n```\n\nLet me explain. The first column of every row represents the name of the command and the subsequent columns represent the arguments or parameters. **xmm**, **ymm** and **zmm** arguments are registry addresses. 
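\nTo make the String.split() remark above concrete, here is a small hypothetical sketch (not this project's actual parser; the hard-coded listing and names are purely illustrative) of what such a line-by-line split could look like:\n\n```javascript\n// Toy parser: turn each Compiler Explorer assembly line into\n// { mnemonic, operands }. Illustrative only -- the real code may differ.\nconst listing = [\n  'vpslldq xmm1, xmm0, 4',\n  'vpaddd xmm0, xmm1, xmm0',\n  'vpslldq xmm1, xmm0, 8',\n  'vpaddd xmm0, xmm0, xmm1'\n];\n\nconst parsed = listing.map(line => {\n  const tokens = line.trim().split(' ');\n  return {\n    mnemonic: tokens[0],\n    operands: tokens.slice(1).join(' ').split(',').map(s => s.trim())\n  };\n});\n\nconsole.log(parsed[0]); // { mnemonic: 'vpslldq', operands: [ 'xmm1', 'xmm0', '4' ] }\n```\n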
Essentially, registries are like the variables you would find in C, except there is a finite amount of them and there is no concepts such as scope and lifetime. The first letter (X, Y, Z) represents the size, which would be (128, 256, 512) bits and there is only 32 addresses of registries. Those characteristics makes registries easier to work with, compared to C variables, since a lot of complexity is therefore removed or taken care of by the compiler. To learn more about AVX registry and it's commands, read [this](https://en.wikipedia.org/wiki/Advanced_Vector_Extensions#New_instructions). \n\nNow that we have parsed every command and their parameters, we can finally start drawing and animating them! To do so, we used [React](https://reactjs.org/) as a JavaScript user interface library and [Anime.js](http://animejs.com/) for the animations.\n\n\n# Development/Contributing\nSIMD-Visualizer is a research project and for now, we are a small team! We actively encourage and support contributions. The SIMD-Visualizer source code is released under the BSD License. This license is very simple, and is friendly to all kinds of projects, whether open source or not. \n\nFeel free to fork and improve/enhance SIMD-Visualizer any way you want. If you feel that the application or the research team will benefit from your changes, please open a pull request.\n\n## Available Scripts\n\nIn the project directory, you can run:\n### `npm start`\n\nRuns the app in the development mode.\nOpen [http://localhost:3000](http://localhost:3000) to view it in the browser.\nThe page will reload if you make edits.\nYou will also see any lint errors in the console.\n### `npm run build`\n\nBuilds the app for production to the `build` folder.\nIt correctly bundles React in production mode and optimizes the build for the best performance.\n\nThe build is minified and the filenames include the hashes.\nYour app is ready to be deployed!\n\n### `npm run deploy`\n\nDeploys application to github-pages. It will build, then push the code to gh-pages branch.\n\n### `npm test`\n\nLaunches the test runner in the interactive watch mode.\nSee the section about [running tests](#running-tests) for more information.\n\n# Credits\n\nThis project is made possible by [Pierre Marie Ntang](https://github.com/pmntang). It is part of his PhD thesis in congnitive computing at [Universit\u00e9 du Qu\u00e9bec (TELUQ)](https://www.teluq.ca/site/en/). Many ideas came from his brilliant mind.\n\nThanks to [Daniel Lemire](https://github.com/lemire) for his many ideas and his deep knowledge and expertise in SIMD software. He is well known in the open source world as well as the big data community. His work is used by companies such as eBay, Facebook, LinkedIn and Netflix in their data warehouses. 
Git also uses his techniques to accelerate queries.\n\nBig thanks to [Matt Godbolt](https://github.com/mattgodbolt/compiler-explorer) for his free and open source REST API of the [Compiler Explorer](https://godbolt.org/), which allows us to use Clang and many other compilers from the browser.\n\n\n# License\nThe [BSD 3-clause](https://tldrlegal.com/license/bsd-3-clause-license-(revised)) license allows you almost unlimited freedom with the software so long as you include the BSD copyright and license notice in it (found in Fulltext).\n", "readme_type": "markdown", "hn_comments": "", "gh_updated_time": "", "gh_accessed_time": "", "hn_accessed_time": ""}, {"name": "DominikDoom/a1111-sd-webui-tagcomplete", "link": "https://github.com/DominikDoom/a1111-sd-webui-tagcomplete", "tags": [], "stars": 600, "description": "Booru style tag autocompletion for AUTOMATIC1111's Stable Diffusion web UI", "lang": "JavaScript", "repo_lang": "", "readme": "![tag_autocomplete_light_zh](https://user-images.githubusercontent.com/34448969/208307331-430696b4-e854-4458-b9e9-f6a6594f19e1.png)\n\n# Booru tag autocompletion for A1111\n\n[![GitHub release (latest SemVer)](https://img.shields.io/github/v/release/DominikDoom/a1111-sd-webui-tagcomplete)](https://github.com/DominikDoom/a1111 -sd-webui-tagcomplete/releases)\n## [English Document](./README.md)\n\n## Functional Overview\n\nThis script is a custom script of [AUTOMATIC1111 web UI](https://github.com/AUTOMATIC1111/stable-diffusion-webui), which can provide booru-style (such as Danbooru) TAG auto-completion when inputting Tags. Because some models are trained based on this TAG style (such as [Waifu Diffusion](https://github.com/harubaru/waifu-diffusion)), using these Tags can achieve more accurate results.\n\nThis script was created to reduce the repeated switching between the Web UI and the booru website due to duplicating Tags.\nYou can download or copy the file according to [the following method](#installation), or use the packaged file in [Releases](https://github.com/DominikDoom/a1111-sd-webui-tagcomplete/releases).\n\n## FAQ & Known Bugs:\n- When the `replaceUnderscores` option is turned on, the script will only replace part of the Tag if the Tag contains multiple words, such as changing `atago (azur lane)` to `taihou` and using auto-completion. 
You will get` taihou (azur lane), lane)`, because the script does not consider the following part as the same Tag.\n\n## Demo and Screenshots\nDemo video (using keyboard navigation):\n\nhttps://user-images.githubusercontent.com/34448969/200128020-10d9a8b2-cea6-4e3f-bcd2-8c40c8c73233.mp4\n\nWildcard support demo:\n\nhttps://user-images.githubusercontent.com/34448969/200128031-22dd7c33-71d1-464f-ae36-5f6c8fd49df0.mp4\n\nDark and light theme support, including Tag colors:\n\n![results_dark](https://user-images.githubusercontent.com/34448969/200128214-3b6f21b4-9dda-4acf-820e-5df0285c30d6.png)\n![results_light](https://user-images.githubusercontent.com/34448969/200128217-bfac8b60-6673-447b-90fd-dc6326f1618c.png)\n\n## Install\n### As an extension (recommended)\nEither clone it into your extensions folder\n```bash\ngit clone \"https://github.com/DominikDoom/a1111-sd-webui-tagcomplete.git\" extensions/tag-autocomplete\n```\n(The second parameter specifies the name of the folder, you can choose anything you like).\n\nOr manually create a folder and put the `javascript`, `scripts` and `tags` folders in it.\n\n### in the root directory (obsolete method)\nThis installation method is suitable for the old version of webui before the extension system is added, and it does not work on the current version.\n\n---\nIn both configurations, the tags folder contains `colors.json` and the tag data used by the script for autocompletion.\nBy default, Tag data includes `Danbooru.csv` and `e621.csv`.\n\nAfter scanning `/embeddings` and wildcards, the list will be stored in the `tags/temp` folder. Deleting the folder will have no effect, it will be recreated on next boot.\n\n### Notice:\n**All three folders** are required for this script to be enabled.\n\n## [Wildcard](https://github.com/jtkelm2/stable-diffusion-webui-1/blob/master/scripts/wildcards.py) & Embedding support\nAutocompletion also works for wildcard files as described in [Wildcard](https://github.com/jtkelm2/stable-diffusion-webui-1/blob/master/scripts/wildcards.py) (demo video follows) . This will allow you to insert the wildcards required by the Wildcard script, and further, you can also insert a specific Tag in the wildcard file.\n\nWhen the `__` character is entered, the wildcard files in the `/scripts/wildcards` folder will be listed in the auto-completion. When you select a specific wildcard file, all the specific tags in it will be listed, but if You just need to select a wildcard, press space.\n\nWhen the `<` character is entered, the `.pt` and `.bin` files under the `embeddings` folder will be listed to the auto-completion. It should be noted that some kaomoji also contain `<` (such as `>_<`), so they will also appear in the results.\n\nNow this feature is enabled by default, and will automatically scan `/embeddings` and `/scripts/wildcards` folders, no longer need to use `tags/wildcardNames.txt` file, users of earlier versions can delete it.\n\n## configuration file\nThe extension has a lot of configuration and customizability built in:\n\n![image](https://user-images.githubusercontent.com/34448969/204093162-99c6a0e7-8183-4f47-963b-1f172774f527.png)\n\n| Settings | Description |\n|---------|-------------|\n| tagFile | Specifies the tag file to use. You can provide a custom tag database of your liking, but since the script was developed with Danbooru tags in mind, it may not work correctly with other configurations. |\n| activeIn | Script that allows to selectively (de)activate negative prompts for txt2img, img2img and both. 
|\n| maxResults | The maximum number of results to display. For the default tag set, the results are sorted by occurrence. For embeds and wildcards, it will display all results in a scrollable list. |\n| showAllResults | If true, maxResults will be ignored and all results will be shown in a scrollable list. **WARNING:** With long listings, your browser may lag. |\n|resultStepLength| Allows to load results in small batches of the specified size for better performance on long lists, or when showAllResults is true. |\n|delayTime| Specifies how many milliseconds to wait before triggering autocomplete. Helps prevent too frequent updates while typing. |\n| replaceUnderscores | If true, replace underscores with spaces when a label is clicked. May be better for some models. |\n| escapeParentheses | If true, escapes tags containing () so they don't contribute to the hint weight feature of the web UI. |\n| useWildcards | Used to toggle wildcard completion. |\n| useEmbeddings | Used to toggle embedding completion. |\n| alias | Options for tag aliases. More info in the section below. |\n| translation | Options for translating tags. More info in the section below. |\n| extras | Options for additional tags files/translations. More info in the section below. |\n\n### colors.json (label colors)\nAdditionally, colors for tag types can be specified using a separate `colors.json` file in the extension's `tags` folder.\nYou can also add new here (same name as file without .csv) for custom label files. The first value is dark mode and the second value is light mode. Both color names and hex codes are supported.\n```json\n{\n\"danbooru\": {\n\"-1\": [\"red\", \"maroon\"],\n\"0\": [\"lightblue\", \"dodgerblue\"],\n\"1\": [\"indianred\", \"firebrick\"],\n\"3\": [\"violet\", \"darkorchid\"],\n\"4\": [\"lightgreen\", \"darkgreen\"],\n\"5\": [\"orange\", \"dark orange\"]\n},\n\"e621\": {\n\"-1\": [\"red\", \"maroon\"],\n\"0\": [\"lightblue\", \"dodgerblue\"],\n\"1\": [\"gold\", \"goldenrod\"],\n\"3\": [\"violet\", \"darkorchid\"],\n\"4\": [\"lightgreen\", \"darkgreen\"],\n\"5\": [\"tomato\", \"darksalmon\"],\n\"6\": [\"red\", \"maroon\"],\n\"7\": [\"white smoke\", \"black\"],\n\"8\": [\"seagreen\", \"darkseagreen\"]\n}\n}\n```\nThe number specifies the type of label, which depends on the source of the label. For example, see [CSV tag data](#csv-tag-data).\n\n### Alias, translation & new Tag\n#### Aliases\nLike the Booru site, a tag can have one or more aliases, redirecting to the actual value when done. These will be searched/displayed based on the settings in `config.json`.\n- `searchByAlias` - Whether to also search for aliases, or just actual tags.\n- `onlyShowAlias` - only show aliases, not `alias->actual`. 
For display only, the final text is still the actual label.\n\n#### translate\nAn extra file can be added in the translations section, which will be used to translate tags and aliases, while also being searchable by translation.\nThis file needs to be `,` in CSV format, but for backwards compatibility with older extra files using the three-column format, you can turn on `oldFormat` instead.\n\nExamples of full and partial Chinese label sets:\n\n![IME-input](https://user-images.githubusercontent.com/34448969/200126551-2264e9cc-abb2-4450-9afa-43f362a77ab0.png)\n![english-input](https://user-images.githubusercontent.com/34448969/200126513-bf6b3940-6e22-41b0-a369-f2b4640f87d6.png)\n\n#### Extra file\nExtra files can be used to add new/custom tags not included in the main set.\nIts format is the same as the normal tag format in [CSV tag data](#csv-tag-data) below, with one exception.\nSince custom tags don't have a post count, the third column (or second column if counting from zero) is used to display the gray meta text next to the tag.\nIf left blank, it will display \"Custom tag\".\n\nTake the default (very basic) extra-quality-tags.csv as an example:\n\n![image](https://user-images.githubusercontent.com/34448969/218264276-cd77ba8e-62d8-41a2-b03c-6c04887ee18b.png)\n\nYou can choose in the settings whether custom tags should be added before or after regular tags.\n\n### CSV tag data\nThe Tag file format of this script is as follows, you can install this format to make your own Tag file:\n```csv\n1girl,0,4114588,\"1girls,sole_female\"\nsolo,0,3426446,\"female_solo,solo_female\"\nhighres,5,3008413,\"high_res,high_resolution,hires\"\nlong_hair,0,2898315,longhair\ncommentary_request,5,2610959,\n```\nIt's worth noting that you don't want to have column names on the first row, and both count and aliases are technically optional.\nAlthough count is always included in the default data. Multiple aliases also need to be separated by commas, but wrapped in string quotes so as not to break CSV parsing.\nThe numbering system follows Danbooru's [tag API docs](https://danbooru.donmai.us/wiki_pages/api%3Atags):\n| Value | Description |\n|-------|-------------|\n|0 | General |\n|1 | Artist |\n|3 | Copyright |\n|4 | Character |\n|5 | Meta |\n\nSimilar to e621:\n| Value | Description |\n|-------|-------------|\n|-1 | Invalid |\n|0 | General |\n|1 | Artist |\n|3 | Copyright |\n|4 | Character |\n|5 | Species |\n|6 | Invalid |\n|7 | Meta |\n|8 | Lore |\n\nThe marker type is used to color the entries in the resulting list.", "readme_type": "markdown", "hn_comments": "", "gh_updated_time": "", "gh_accessed_time": "", "hn_accessed_time": ""}, {"name": "hzlzh/Front-End-Standards", "link": "https://github.com/hzlzh/Front-End-Standards", "tags": [], "stars": 596, "description": "\u9002\u7528\u4e8e\u5c0f\u56e2\u961f\u7684\u524d\u7aef\u89c4\u8303", "lang": "JavaScript", "repo_lang": "", "readme": "# front-end specification\n\nThis is the code writing specification followed and agreed by the `front-end development team`, which is intended to improve the standardization and maintainability of the code.\nThis specification is a reference specification, not all mandatory requirements, and unifies the team's coding standards and styles. 
Make all codes follow rules and be precipitated to reduce duplication of work.\n\nAccess address: [http://hzlzh.github.io/Front-End-Standards/]\n\nMore:\n\n* Support MarkDown <-> HTML preview <-> HTML source switching method\n* Support code syntax highlighting\n* Note the recommended front-end development tools\n\n## contribute\n\nThe `Frontend Specification` is hosted on `Github Page`, `Fork` & `Star` to make it more powerful.\n\n* Pages Git branch: * `gh-pages`\n\n## License\n\nAvailable under MIT. See [LICENSE] for more details.\n\n[http://hzlzh.github.io/Front-End-Standards/]: http://hzlzh.github.io/Front-End-Standards/ 'Front-End-Standards'\n[LICENSE]: http://rem.mit-license.org 'MIT License'", "readme_type": "markdown", "hn_comments": "", "gh_updated_time": "", "gh_accessed_time": "", "hn_accessed_time": ""}, {"name": "cyb3rfox/Aurora-Incident-Response", "link": "https://github.com/cyb3rfox/Aurora-Incident-Response", "tags": ["incident-response", "incident-management", "incident-response-tooling"], "stars": 596, "description": "Incident Response Documentation made easy. Developed by Incident Responders for Incident Responders", "lang": "JavaScript", "repo_lang": "", "readme": "# Aurora Incident Response\n\nIncident Response Documentation made easy. Developed by Incident Responders for Incident Responders.\nAurora brings \"Spreadsheet of Doom\" used in the SANS FOR508 class to the next level. Having led many cases and taught so many students how to do IR right, I realized, that many struggle\nwith keeping control over all the findings. That does not only prevent them from seeing what they already have, but even less so what they are missing. \n\nIt's intended to be used in small and big incident response investigations to track findings, tasks, making reporting easy and generally stay on top of the game. The current version has been battle tested multiple times now. \nI'll keep fixing bugs and adding features as we go, but please remember, it's a leisure time project. So any help is appreciated.\n\nLateral Movement\n![alt text](./images/lateral.png \"Lateral Movement Visualization\")\nVisual Timeline\n![alt text](./images/timeline.png \"Visual Timeline\")\n\n\n\n## 1 Download & Installation\n\nYou can download the current release of Aurora Incident Response from the [Releases Page](https://github.com/cyb3rfox/Aurora-Incident-Response/releases).\nAurora Incident Response is available for MacOS, Windows and Linux. We are working on making it available for\niPads and Android tablets as well.\n\nHere's a video on how to use Aurora:\n\n[![](http://img.youtube.com/vi/2j2XYcqQIm0/0.jpg)](http://www.youtube.com/watch?v=2j2XYcqQIm0 \"\")\n\n## 2 Development\n\nIf you want to contribute, you are encouraged to do so. I'd totally like to see the tool growing. \nThe whole application is built on an electron base and written in plain Javascript and HTML.\nEven though technically I could have used node.js modules for functionality like Webdav I refrained from it.\nThe reason is, that node modules will not run out of the box when migrating the code to phonegap for IOS and Android.\nThe good news is, it's really fast to set up your development environment. I personally use Webstorm but it should work with pretty much any IDE.\n\n### 2.1 Set up your build environment\n\nAs pointed out in the description, Aurora Incident Response is built on top of Electron which allows for multi platform compatibility.\nYou can easily install your tool chain the following way.\n\nStart by installing `node.js`. 
Follow the links to their [download page](https://nodejs.org/en/download/).\n\nWith `nodejs` installed, check out the [Aurora Github repository](https://github.com/cyb3rfox/Aurora-Incident-Response) (or fork first if you want to contribute).\n\ngit clone https://github.com/cyb3rfox/Aurora-Incident-Response
\n\ncd Aurora-Incident-Response/src\n
\n\nNow you need to install Electron using npm. Currently Aurora is configured to run with `electron` 4.0.6. \n\nnpm install electron@4.0.6
\n\nYou can now run the code by invoking:\n\nnode_modules/.bin/electron .
\n\nThat's fast, isn't it?\n\n### 2.2 Roadmap\n\nThe following points are already on the roadmap. Please just post a new issue or send a message on [Twitter](https://twitter.com/cyberfox) if you have any suggestions for new improvements.\n\nYou can check out the planned features for the next releases under [projects](https://github.com/cyb3rfox/Aurora-Incident-Response/projects).\n\n### 2.3 Build executables for distribution\n\nTo build and cross-build, I use `electron-packager`.\n \nnpm install electron-packager
\n\nBuild for Windows:\n\n./node_modules/.bin/electron-packager . Aurora --asar --prune --platform=win32 --electron-version=4.0.6 --arch=x64 --icon=icon/aurora.ico --out=release-builds --ignore \"node_modules/\\.bin\"
\n\nBuild for MacOS:\n\n./node_modules/.bin/electron-packager ./src Aurora --overwrite --platform=darwin --arch=x64 --icon=icon/aurora.icns --prune=true --out=release-builds
\n\nBuild for Linux:\n\n./node_modules/.bin/electron-packager . Aurora --asar --prune --platform=linux --electron-version=4.0.6 --arch=x64 --icon=icon/aurora.ico --out=release-builds --ignore \"node_modules/\\.bin\"
\n\n### 2.4 Sourcecode Navigator\n\nThis section describes the various source code files. For now I need to keep this section small. I tried to comment the code as well as I can. If you have any questions, just ping me. If you want to join me in developing the tool, there's a Slack channel to communicate. Drop me a note and I will invite you.\n\n#### 2.4.1 `main.js`\n\nElectron apps differentiate between a main (background) process and a render process (the Chromium browser window). This file controls the main process. \nFor Aurora you usually only need to go there if you want to turn on the JavaScript console in the Aurora window. Just uncomment the following line:\n\nwin.webContents.openDevTools()
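\nFor orientation, that call usually sits where the main window is created. The following is a rough, illustrative Electron main-process sketch (an assumption for context only, not Aurora's actual main.js):\n\n```javascript\n// Illustrative Electron main process -- not Aurora's real code.\nconst { app, BrowserWindow } = require('electron');\n\nlet win;\n\napp.on('ready', () => {\n  win = new BrowserWindow({ width: 1400, height: 900 });\n  win.loadFile('index.html');\n  // Uncomment the next line to open the Chromium JavaScript console:\n  // win.webContents.openDevTools();\n});\n```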
\n\nThe second thing that's handled there is autosaving and unlocking when you exit the program. For that, the main and the render process share a global variable called global.Dirty.is_dirty
that is used to signal to the main process if it can quit right away or if the file needs to be sanitized before exiting.\nIt's actually a very similar concept to the NTFS dirty bit.\n \n#### 2.4.2 `index.html`\n`\nThis is the main Aurora file that strings together all scripts and stylesheets. It also initiates the GUI. other than that it has no functionality.\n \n#### 2.4.3 `gui_definitions.js`\n\nI tried as good as I can to separate code an design. This file holds all the definition json for the `w2ui` GUI. There is some code left in there\nfor the renderers that format certain columns. It didn't make sense to place them anywhere else.\n \n#### 2.4.4 `controller.js`\n\nThe controller injects the handling functions for gui events. So whenever a button is pressed, or any other event needed happens, controller.js handles what happens.\n \n#### 2.4.5 `gui_functions.js`\n\nEvery now and then operations happen that change something in the GUI. That could e. g. be making all the datafields readonly when you don't have the lock or simply opening a popup.\nAll these functions are located in this file which is a plain JSON file.\n \n#### 2.4.6 `data.js`\n\nWhile the actual data is stored in the `w2ui` datasctructures, for saving and some other operations we need to bring it into out format. \nTransformations like this and all logic regarding saving and opening files is located here.\n \n#### 2.4.7 `data_template.js`\n \nThis holds templates for the internal data format of that version. Current format version is 3. \n \n#### 2.4.8 `misp.js`\n \nCode for MISP integration.\n \n#### 2.4.9 `virustotal.js`\n \nCode for VT integration.\n \n#### 2.4.10 `settings.js`\n \nSettings for different libraries. Currently only defines the time field format for `w2ui`.\n \n#### 2.4.11 `helper_functions.js`\n\nSmall helper functions that do not fit anywhere else.\n\n#### 2.4.12 `import.js`\n\nHandles CSV imports\n\n#### 2.4.13 `exports.js`\n\nHandles CSV exports\n\n## 3 Licensing\n\nAurora is licensed under the Apache 2 License.\n\n## 4 Credits\nProjects like this can only be realized because many people invested thousands of hours into writing cool libraries and other software. Others contribute professional UI items. Thank you for all your great work. Namely I build Aurora based on the following dependencies:\n\n* Electron https://www.electronjs.org\n* jquery https://jquery.com\n* w2ui http://w2ui.com/web/\n* vis.js https://visjs.org\n* icons8 https://icons8.com\n* Fontawsome https://fontawesome.com\n\nBesides the incredible amount of work that people invested into these projects, you need other support as well. Writing the code is easy, but making it a tool the works in reality depends on a vast amount of experience from many incident responders. \nHere I particularly want to mention the members of my IR team who contributed their knowledge and helped testing the tool in real world cases:\n\n* Rothi\n* Bruno\n* Sandro\n\nEven though this is a side and weekend project it's still good to know, that my employer Infoguard AG supports me in any way they can. Thank you particularly to:\n\n* Ernesto\n* Thomas\n\nThe following people have contributed changes that have a significant impact on the tool:\n\n* F\u00e9lix Brezo, Ph. D. 
(working on visualization parts)\n\n\n\n", "readme_type": "markdown", "hn_comments": "", "gh_updated_time": "", "gh_accessed_time": "", "hn_accessed_time": ""}, {"name": "ethereum/meteor-dapp-wallet", "link": "https://github.com/ethereum/meteor-dapp-wallet", "tags": [], "stars": 596, "description": null, "lang": "JavaScript", "repo_lang": "", "readme": "# Ethereum Wallet \u00d0app\n\nThe Ethereum wallet.\n\n[![Build Status](https://travis-ci.org/ethereum/meteor-dapp-wallet.svg?branch=master)](https://travis-ci.org/ethereum/meteor-dapp-wallet)\n\n**PLEASE NOTE:** This wallet is not yet officially released,\nand can contain severe bugs! Please use at your own risk.\n\n## Install\n\nIf you don't have [Meteor](https://www.meteor.com/install):\n\n $ curl https://install.meteor.com/ | sh\n\nInstall npm dependencies:\n\n $ cd meteor-dapp-wallet/app\n $ npm install\n\n## Development\n\nStart a `geth` node:\n\n $ geth --ws --wsorigins \"http://localhost:3000\" --unlock \n\nRun dev server:\n\n $ cd meteor-dapp-wallet/app\n $ meteor\n\nNavigate to http://localhost:3000\n\n## Deployment\n\nTo create a build:\n\n $ npm install -g meteor-build-client\n $ cd meteor-dapp-wallet/app\n $ npm install\n $ meteor-build-client ../build --path \"\"\n\nThis will generate the files in the `../build` folder.\n\nNavigating to `index.html` will start the app, but you will need to serve it over a local server like [MAMP](https://www.mamp.info).\n\n---\n\nTo deploy to the **wallet.ethereum.org** site, execute these commands:\n\n $ git checkout gh-pages\n $ git merge develop\n $ cd app\n $ meteor-build-client ../build --path \"/\"\n\nAnd push (or PR) your changes to the `gh-pages` branch.\n\n---\n\n## Gas usage statistics\n\n- Deploy original wallet: 1 230 162\n- Deploy wallet stub: 184 280\n- Simple Wallet transaction: 64 280\n- Multisig Wallet transaction below daily limit: 79 280\n- Multisig Wallet transaction above daily limit: 171 096\n- 1 Multisig confirmation: 48 363\n", "readme_type": "markdown", "hn_comments": "", "gh_updated_time": "", "gh_accessed_time": "", "hn_accessed_time": ""}, {"name": "CulturalMe/meteor-slingshot", "link": "https://github.com/CulturalMe/meteor-slingshot", "tags": [], "stars": 595, "description": "Upload files directly to AWS S3, Google Cloud Storage and others in meteor", "lang": "JavaScript", "repo_lang": "", "readme": "meteor-slingshot\n================\n\n[![](https://api.travis-ci.org/CulturalMe/meteor-slingshot.svg)](https://travis-ci.org/CulturalMe/meteor-slingshot) [![Gitter](https://badges.gitter.im/Join%20Chat.svg)](https://gitter.im/CulturalMe/meteor-slingshot?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge&utm_content=badge)\n\nDirect and secure file-uploads to AWS S3, Google Cloud Storage and others.\n\n## Install\n\n```bash\nmeteor add edgee:slingshot\n```\n\n## Why?\n\nThere are many many packages out there that allow file uploads to S3,\nGoogle Cloud and other cloud storage services, but they usually rely on the\nmeteor apps' server to relay the files to the cloud service, which puts the\nserver under unnecessary load.\n\nmeteor-slingshot uploads the files directly to the cloud service from the\nbrowser without ever exposing your secret access key or any other sensitive data\nto the client and without requiring public write access to cloud storage to the\nentire public.\n\n \n\nFile uploads can not only be restricted by file-size and file-type, but also by\nother stateful criteria such as the current meteor user.\n\n## Quick Example\n\n### Client side\n\nOn the 
client side we can now upload files through to the bucket:\n\n```JavaScript\nvar uploader = new Slingshot.Upload(\"myFileUploads\");\n\nuploader.send(document.getElementById('input').files[0], function (error, downloadUrl) {\n if (error) {\n // Log service detailed response.\n console.error('Error uploading', uploader.xhr.response);\n alert (error);\n }\n else {\n Meteor.users.update(Meteor.userId(), {$push: {\"profile.files\": downloadUrl}});\n }\n});\n```\n\n### Client and Server\n\nThese file upload restrictions are validated on the client and then appended to\nthe directive on the server side to enforce them:\n\n```JavaScript\nSlingshot.fileRestrictions(\"myFileUploads\", {\n allowedFileTypes: [\"image/png\", \"image/jpeg\", \"image/gif\"],\n maxSize: 10 * 1024 * 1024 // 10 MB (use null for unlimited).\n});\n```\n\nImportant: The `fileRestrictions` must be declared before the the directive is instantiated.\n\n### Server side\n\nOn the server we declare a directive that controls upload access rules:\n\n```JavaScript\nSlingshot.createDirective(\"myFileUploads\", Slingshot.S3Storage, {\n bucket: \"mybucket\",\n\n acl: \"public-read\",\n\n authorize: function () {\n //Deny uploads if user is not logged in.\n if (!this.userId) {\n var message = \"Please login before posting files\";\n throw new Meteor.Error(\"Login Required\", message);\n }\n\n return true;\n },\n\n key: function (file) {\n //Store file into a directory by the user's username.\n var user = Meteor.users.findOne(this.userId);\n return user.username + \"/\" + file.name;\n }\n});\n```\n\nWith the directive above, no other files than images will be allowed. The\npolicy is directed by the meteor app server and enforced by AWS S3.\n\nNote: If your bucket is created in any region other than `US Standard`, you will need to set the `region` key in the directive. Refer the [AWS Slingshot Storage Directives](#aws-s3-slingshots3storage)\n\n## Storage services\n\nThe client side is agnostic to which storage service is used. All it\nneeds for the file upload to work, is a directive name.\n\nThere is no limit imposed on how many directives can be declared for each\nstorage service.\n\nStorage services are pluggable in Slingshot and you can add support for own\nstorage service as described in a section below.\n\n## Progress bars\n\nYou can create file upload progress bars as follows:\n\n```handlebars\n\n \n
\n {{progress}}% Complete \n
\n
\n \n```\n\nUsing the `Slingshot.Upload` instance read and react to the progress:\n\n```JavaScript\nTemplate.progressBar.helpers({\n progress: function () {\n return Math.round(this.uploader.progress() * 100);\n }\n});\n```\n\n## Show uploaded file before it is uploaded (latency compensation)\n\n```handlebars\n\n \n \n```\n\n```JavaScript\nTemplate.myPicture.helpers({\n url: function () {\n //If we are uploading an image, pass true to download the image into cache.\n //This will preload the image before using the remote image url.\n return this.uploader.url(true);\n }\n});\n```\n\nThis to show the image from the local source until it is uploaded to the server.\nIf Blob URL's are not available it will attempt to use `FileReader` to generate\na base64-encoded url representing the data as a fallback.\n\n## Add meta-context to your uploads\n\nYou can add meta-context to your file-uploads, to make your requests more\nspecific on where the files are to be uploaded.\n\nConsider the following example...\n\nWe have an app that features picture albums. An album belongs to a user and\nonly that user is allowed to upload picture to it. In the cloud each album has\nits own directory where its pictures are stored.\n\nWe declare our client-side uploader as follows:\n\n```JavaScript\nvar metaContext = {albumId: album._id}\nvar uploadToMyAlbum = new Slingshot.Upload(\"picturealbum\", metaContext);\n```\n\nOn the server side the directive can now set the key accordingly and check if\nthe user is allowed post pictures to the given album:\n\n```JavaScript\nSlingshot.createDirective(\"picturealbum\", Slingshot.GoogleCloud, {\n acl: \"public-read\",\n\n authorize: function (file, metaContext) {\n var album = Albums.findOne(metaContext.albumId);\n\n //Denied if album doesn't exist or if it is not owned by the current user.\n return album && album.userId === this.userId;\n },\n\n key: function (file, metaContext) {\n return metaContext.albumId + \"/\" + Date.now() + \"-\" + file.name;\n }\n});\n```\n## Manual Client Side validation\n\nYou can check if a file uploadable according to file-restrictions as follows:\n\n```JavaScript\nvar uploader = new Slingshot.Upload(\"myFileUploads\");\n\nvar error = uploader.validate(document.getElementById('input').files[0]);\nif (error) {\n console.error(error);\n}\n```\n\nThe validate method will return `null` if valid and returns an `Error` instance\nif validation fails.\n\n\n### AWS S3\n\nYou will need a`AWSAccessKeyId` and `AWSSecretAccessKey` in `Meteor.settings`\nand a bucket with the following CORS configuration:\n\n```xml\n\n\n \n * \n PUT \n POST \n GET \n HEAD \n 3000 \n * \n \n \n```\n\nDeclare AWS S3 Directives as follows:\n\n```JavaScript\nSlingshot.createDirective(\"aws-s3-example\", Slingshot.S3Storage, {\n //...\n});\n```\n\n#### S3 with temporary AWS Credentials (Advanced)\n\nFor extra security you can use\n[temporary credentials](http://docs.aws.amazon.com/STS/latest/UsingSTS/CreatingSessionTokens.html) to sign upload requests.\n\n```JavaScript\nvar sts = new AWS.STS(); // Using the AWS SDK to retrieve temporary credentials.\n\nSlingshot.createDirective('myUploads', Slingshot.S3Storage.TempCredentials, {\n bucket: 'myBucket',\n temporaryCredentials: Meteor.wrapAsync(function (expire, callback) {\n //AWS dictates that the minimum duration must be 900 seconds:\n var duration = Math.max(Math.round(expire / 1000), 900);\n\n sts.getSessionToken({\n DurationSeconds: duration\n }, function (error, result) {\n callback(error, result && result.Credentials);\n });\n 
})\n});\n```\n\nIf you are running slingshot on an EC2 instance, you can conveniantly retreive\nyour access keys with [`AWS.EC2MetadataCredentials`](http://docs.aws.amazon.com/AWSJavaScriptSDK/latest/AWS/EC2MetadataCredentials.html):\n\n```JavaScript\nvar credentials = new AWS.EC2MetadataCredentials();\n\nvar updateCredentials = Meteor.wrapAsync(credentials.get, credentials);\n\nSlingshot.createDirective('myUploads', Slingshot.S3Storage.TempCredentials, {\n bucket: 'myBucket',\n temporaryCredentials: function () {\n if (credentials.needsRefresh()) {\n updateCredentials();\n }\n\n return {\n AccessKeyId: credentials.accessKeyId,\n SecretAccessKey: credentials.secretAccessKey,\n SessionToken: credentials.sessionToken\n };\n }\n});\n```\n\n### Google Cloud\n\n[Generate a private key](http://goo.gl/kxt5qz) and convert it to a `.pem` file\nusing openssl:\n\n```\nopenssl pkcs12 -in google-cloud-service-key.p12 -nodes -nocerts > google-cloud-service-key.pem\n```\n\nSetup CORS on the bucket:\n\n```\ngsutil cors set docs/gs-cors.json gs://mybucket\n```\n\nSave this file into the `/private` directory of your meteor app and add this\nline to your server-side code:\n\n```JavaScript\nSlingshot.GoogleCloud.directiveDefault.GoogleSecretKey = Assets.getText('google-cloud-service-key.pem');\n```\nDeclare Google Cloud Storage Directives as follows:\n\n```JavaScript\nSlingshot.createDirective(\"google-cloud-example\", Slingshot.GoogleCloud, {\n //...\n});\n```\n\n### Rackspace Cloud Files\n\nYou will need a`RackspaceAccountId` (your acocunt number) and\n`RackspaceMetaDataKey` in `Meteor.settings`.\n\nIn order to obtain your `RackspaceMetaDataKey` (a.k.a. Account-Meta-Temp-Url-Key)\nyou need an\n[auth-token](http://docs.rackspace.com/loadbalancers/api/v1.0/clb-getting-started/content/Generating_Auth_Token.html)\nand then follow the\n[instructions here](http://docs.rackspace.com/files/api/v1/cf-devguide/content/Set_Account_Metadata-d1a666.html).\n\nNote that API-Key, Auth-Token, Meta-Data-Key are not the same thing:\n\nAPI-Key is what you need to obtain an Auth-Token, which in turn is what you need\nto setup CORS and to set your Meta-Data-Key. The auth-token expires after 24 hours.\n\nFor your directive you need container and provide its name, region and cdn.\n\n```JavaScript\nSlingshot.createDirective(\"rackspace-files-example\", Slingshot.RackspaceFiles, {\n container: \"myContainer\", //Container name.\n region: \"lon3\", //Region code (The default would be 'iad3').\n\n //You must set the cdn if you want the files to be publicly accessible:\n cdn: \"https://abcdefghije8c9d17810-ef6d926c15e2b87b22e15225c32e2e17.r19.cf5.rackcdn.com\",\n\n pathPrefix: function (file) {\n //Store file into a directory by the user's username.\n var user = Meteor.users.findOne(this.userId);\n return user.username;\n }\n});\n```\n\nTo setup CORS you also need to your Auth-Token from above and use:\n\n```bash\ncurl -I -X POST -H 'X-Auth-Token: yourAuthToken' \\\n -H 'X-Container-Meta-Access-Control-Allow-Origin: *' \\\n -H 'X-Container-Meta-Access-Expose-Headers: etag location x-timestamp x-trans-id Access-Control-Allow-Origin' \\\n https://storage101.containerRegion.clouddrive.com/v1/MossoCloudFS_yourAccoountNumber/yourContainer\n```\n\n\n### Cloudinary\n\nCloudinary is supported via a 3rd party package. 
\n\n\n### Cloudinary\n\nCloudinary is supported via a 3rd party package:\n[jimmiebtlr:cloudinary](https://atmospherejs.com/jimmiebtlr/slingshot-cloudinary)\n\n## Browser Compatibility\n\nCurrently the uploader uses `XMLHttpRequest 2` to upload the files, which is not\nsupported by Internet Explorer 9 and older.\n\nThis can be circumvented by falling back to iframe uploads in future versions,\nif required.\n\nLatency compensation is available in Internet Explorer 10.\n\n## Security\n\nThe secret key never leaves the meteor app server. Nobody will be able to upload\nanything to your buckets outside of your meteor app.\n\nInstead of using secret access keys, Slingshot uses a policy document that is\nsent along with the file to AWS S3 or Google Cloud Storage. This policy is\nsigned by the secret key and contains all the restrictions that you define in\nthe directive. By default a signed policy expires after 5 minutes.\n\n## Adding Support for Other Storage Services\n\nCloud storage services are pluggable in Slingshot. You can add support for a\ncloud storage service of your choice. All you need to do is declare an object\nwith the following parameters:\n\n```JavaScript\nMyStorageService = {\n\n /**\n * Define the additional parameters that your service uses here.\n *\n * Note that some parameters like maxSize are shared by all services. You do\n * not need to define those by yourself.\n */\n\n\n directiveMatch: {\n accessKey: String,\n\n options: Object,\n\n foo: Match.Optional(Function)\n },\n\n /**\n * Here you can set default parameters that your service will use.\n */\n\n directiveDefault: {\n options: {}\n },\n\n\n /**\n *\n * @param {Object} method - This is the Meteor Method context.\n * @param {Object} directive - All the parameters from the directive.\n * @param {Object} file - Information about the file as gathered by the\n * browser.\n * @param {Object} [meta] - Meta data that was passed to the uploader.\n *\n * @returns {UploadInstructions}\n */\n\n upload: function (method, directive, file, meta) {\n var accessKey = directive.accessKey;\n\n var fooData = directive.foo && directive.foo.call(method, file, meta);\n\n //Here you need to make sure that all parameters passed in the directive\n //are going to be enforced by the server receiving the file.\n\n return {\n // Endpoint where the file is to be uploaded:\n upload: \"https://example.com\",\n\n // Download URL, once the file is uploaded:\n download: directive.cdn || \"https://example.com/\" + file.name,\n\n // POST data to be attached to the file-upload:\n postData: [\n {\n name: \"accessKey\",\n value: accessKey\n },\n {\n name: \"signature\",\n value: signature\n }\n //...\n ],\n\n // HTTP headers to send when uploading:\n headers: {\n \"x-foo-bar\": fooData\n }\n };\n },\n\n /**\n * Absolute maximum file-size allowable by the storage service.\n */\n\n maxSize: 5 * 1024 * 1024 * 1024\n};\n```\n\nExample Directive:\n\n```JavaScript\nSlingshot.createDirective(\"myUploads\", MyStorageService, {\n accessKey: \"a12345xyz\",\n foo: function (file, metaContext) {\n return \"bar\";\n }\n});\n```\n\n## Dependencies\n\nMeteor core packages:\n\n * underscore\n * tracker\n * reactive-var\n * check\n\n## Troubleshooting and Help\n\nIf you have any queries about how to use slingshot, how to get it to work with\nthe different services, or any other general questions about it, please [post a question on Stack Overflow](http://stackoverflow.com/questions/ask?tags=meteor-slingshot). You will get a high\nquality answer there much quicker than by posting an issue here on GitHub.\n\nBug reports, Feature Requests and Pull Requests are always welcome.
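\n\nAs a quick orientation for the API reference below, this is what a directive combining the general options and file restrictions might look like (the directive name, bucket and key layout are only illustrative):\n\n```JavaScript\nSlingshot.createDirective(\"userAvatars\", Slingshot.S3Storage, {\n  bucket: \"my-avatar-bucket\", //Illustrative bucket name.\n\n  acl: \"public-read\",\n\n  authorize: function () {\n    //Only allow logged-in users to upload:\n    return !!this.userId;\n  },\n\n  maxSize: 2 * 1024 * 1024, //2 MB\n  allowedFileTypes: [\"image/png\", \"image/jpeg\", \"image/gif\"],\n\n  key: function (file) {\n    //One folder per user, keyed by the userId from the method context:\n    return \"avatars/\" + this.userId + \"/\" + file.name;\n  }\n});\n```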
\n\n## API Reference\n\n### Directives\n\n#### General (All Services)\n\n`authorize` Function (**required** unless set in File Restrictions)\n\n`maxSize` Number (**required** unless set in File Restrictions)\n\n`allowedFileTypes` RegExp, String or Array (**required** unless set in File\nRestrictions)\n\n`cdn` String (optional) - CDN domain for downloads,\ne.g. `\"https://d111111abcdef8.cloudfront.net\"`\n\n`expire` Number (optional) - Number of milliseconds after which an upload\nauthorization expires, counted from when the request was made. The default is 5 minutes.\n\n#### AWS S3 (`Slingshot.S3Storage`)\n\n`region` String (optional) - Default is `Meteor.settings.AWSRegion` or\n\"us-east-1\". [See AWS Regions](http://docs.aws.amazon.com/general/latest/gr/rande.html#s3_region)\n\n`AWSAccessKeyId` String (**required**) - Can also be set in `Meteor.settings`.\n\n`AWSSecretAccessKey` String (**required**) - Can also be set in `Meteor.settings`.\n\n#### AWS S3 with Temporary Credentials (`Slingshot.S3Storage.TempCredentials`)\n\n`region` String (optional) - Default is `Meteor.settings.AWSRegion` or\n\"us-east-1\". [See AWS Regions](http://docs.aws.amazon.com/general/latest/gr/rande.html#s3_region)\n\n`temporaryCredentials` Function (**required**) - Function that generates temporary\ncredentials. It takes a single argument, which is the minimum desired expiration\ntime in milliseconds, and it returns an object that contains `AccessKeyId`,\n`SecretAccessKey` and `SessionToken`.\n\n#### Google Cloud Storage (`Slingshot.GoogleCloud`)\n\n`bucket` String (**required**) - Name of the bucket to use. The default is\n`Meteor.settings.GoogleCloudBucket`.\n\n`GoogleAccessId` String (**required**) - Can also be set in `Meteor.settings`.\n\n`GoogleSecretKey` String (**required**) - Can also be set in `Meteor.settings`.\n\n#### AWS S3 and Google Cloud Storage\n\n`bucket` String (**required**) - Name of the bucket to use. The default is\n`Meteor.settings.GoogleCloudBucket`. For AWS S3 the default bucket is\n`Meteor.settings.S3Bucket`.\n\n`bucketUrl` String or Function (optional) - Override URL to which files are\n uploaded. If it is a function, then the first argument is the bucket name. This\n url is also used for downloads unless a cdn is given.\n\n`key` String or Function (**required**) - Name of the file on the cloud storage\nservice. If a function is provided, it will be called with `userId` in the\ncontext and its return value is used as the key. The first argument is the file info and\nthe second is the meta-information that can be passed by the client.\n\n`acl` String (optional)\n\n`cacheControl` String (optional) - RFC 2616 Cache-Control directive\n\n`contentDisposition` String or Function (optional) - RFC 2616\nContent-Disposition directive. Default is the uploaded file's name (inline). If\nit is a function then it takes the same context and arguments as the `key`\nfunction. Use null to disable.\n\n#### Rackspace Cloud (`Slingshot.RackspaceFiles`)\n\n`RackspaceAccountId` String (**required**) - Can also be set in `Meteor.settings`.\n\n`RackspaceMetaDataKey` String (**required**) - Can also be set in `Meteor.settings`.\n\n`container` String (**required**) - Name of the container to use.\n\n`region` String (optional) - Data Center region.
The default is `\"iad3\"`.\n[See other regions](http://docs.rackspace.com/files/api/v1/cf-devguide/content/Service-Access-Endpoints-d1e003.html)\n\n`pathPrefix` String or Function (**required**) - Similar to `key` for S3, but\nwill always be appended by `file.name` that is provided by the client.\n\n`deleteAt` Date (optional) - Absolute time when the uploaded file is to be\ndeleted. _This attribute is not enforced at all. It can be easily altered by the\nclient_\n\n`deleteAfter` Number (optional) - Same as `deleteAt`, but relative.\n\n### File restrictions\n\n`authorize` Function (optional) - Function to determines if upload is allowed.\n\n`maxSize` Number (optional) - Maximum file-size (in bytes). Use `null` or `0`\nfor unlimited.\n\n`allowedFileTypes` RegExp, String or Array (optional) - Allowed MIME types. Use\nnull for any file type.\n", "readme_type": "markdown", "hn_comments": "", "gh_updated_time": "", "gh_accessed_time": "", "hn_accessed_time": ""}, {"name": "shamahoque/mern-social", "link": "https://github.com/shamahoque/mern-social", "tags": ["reactjs", "nodejs", "expressjs", "mongodb", "mern-stack", "mern", "full-stack", "web-application", "social-media"], "stars": 596, "description": "A MERN stack based social media application [Full-Stack React Projects]", "lang": "JavaScript", "repo_lang": "", "readme": "# MERN Social 2.0\n- *Looking for the first edition code? [Check here](https://github.com/shamahoque/mern-social/tree/master)*\n\nA simple social media application with users, posts, likes and comments - developed using React, Node, Express and MongoDB. \n\n![MERN Social](https://s3.amazonaws.com/mernbook/git+/social.png \"MERN Social\")\n\n### [Live Demo](http://social2.mernbook.com/ \"MERN Social\")\n\n#### What you need to run this code\n1. Node (13.12.0)\n2. NPM (6.14.4) or Yarn (1.22.4)\n3. MongoDB (4.2.0)\n\n#### How to run this code\n1. Make sure MongoDB is running on your system \n2. Clone this repository\n3. Open command line in the cloned folder,\n - To install dependencies, run ``` npm install ``` or ``` yarn ```\n - To run the application for development, run ``` npm run development ``` or ``` yarn development ```\n4. Open [localhost:3000](http://localhost:3000/) in the browser\n---- \n### More applications built using this stack\n\n* [MERN Skeleton](https://github.com/shamahoque/mern-social/tree/second-edition)\n* [MERN Classroom](https://github.com/shamahoque/mern-classroom)\n* [MERN Marketplace](https://github.com/shamahoque/mern-marketplace/tree/second-edition)\n* [MERN Expense Tracker](https://github.com/shamahoque/mern-expense-tracker)\n* [MERN Mediastream](https://github.com/shamahoque/mern-mediastream/tree/second-edition)\n* [MERN VR Game](https://github.com/shamahoque/mern-vrgame/tree/second-edition)\n\nLearn more at [mernbook.com](http://www.mernbook.com/)\n\n----\n## Get the book\n#### [Full-Stack React Projects - Second Edition](https://www.packtpub.com/web-development/full-stack-react-projects-second-edition)\n*Learn MERN stack development by building modern web apps using MongoDB, Express, React, and Node.js*\n\n \n\nReact combined with industry-tested, server-side technologies, such as Node, Express, and MongoDB, enables you to develop and deploy robust real-world full-stack web apps. This updated second edition focuses on the latest versions and conventions of the technologies in this stack, along with their new features such as Hooks in React and async/await in JavaScript. 
The book also explores advanced topics such as implementing real-time bidding, a web-based classroom app, and data visualization in an expense tracking app.\n\nFull-Stack React Projects will take you through the process of preparing the development environment for MERN stack-based web development, creating a basic skeleton app, and extending it to build six different web apps. You'll build apps for social media, classrooms, media streaming, online marketplaces with real-time bidding, and web-based games with virtual reality features. Throughout the book, you'll learn how MERN stack web development works, extend its capabilities for complex features, and gain actionable insights into creating MERN-based apps, along with exploring industry best practices to meet the ever-increasing demands of the real world.\n\nThings you'll learn in this book:\n\n- Extend a MERN-based application to build a variety of applications\n- Add real-time communication capabilities with Socket.IO\n- Implement data visualization features for React applications using Victory\n- Develop media streaming applications using MongoDB GridFS\n- Improve SEO for your MERN apps by implementing server-side rendering with data\n- Implement user authentication and authorization using JSON web tokens\n- Set up and use React 360 to develop user interfaces with VR capabilities\n- Make your MERN stack applications reliable and scalable with industry best practices\n\nIf you feel this book is for you, get your [copy](https://www.amazon.com/dp/1839215410) today!\n\n---\n", "readme_type": "markdown", "hn_comments": "", "gh_updated_time": "", "gh_accessed_time": "", "hn_accessed_time": ""}, {"name": "bitovi/documentjs", "link": "https://github.com/bitovi/documentjs", "tags": [], "stars": 596, "description": "The sophisticated documentation engine", "lang": "JavaScript", "repo_lang": "", "readme": "# DocumentJS\n\n[![Gitter](https://badges.gitter.im/Join%20Chat.svg)](https://gitter.im/bitovi/documentjs?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge&utm_content=badge)\n\n[![Build Status](https://travis-ci.org/bitovi/documentjs.svg?branch=master)](https://travis-ci.org/bitovi/documentjs)\n[![Build status](https://ci.appveyor.com/api/projects/status/f2e9ho3cwx98hajp/branch/master?svg=true)](https://ci.appveyor.com/project/matthewp/documentjs/branch/master)\n\nDocumentJS creates beautiful, articulate, multi-versioned documentation. With DocumentJS, you can:\n\n - Write documentation inline or in markdown files.\n - Specify your code's behavior precisely with JSDoc\n and [Google Closure Compiler](https://developers.google.com/closure/compiler/docs/js-for-compiler)\n annotations.\n - Customize your site's theme and layout.\n - Generate multi-version documentation.\n\nGo to [documentjs.com](http://documentjs.com) for guides and documentation.\n\n\n## Changelog\n\n### 0.2.0 _Nov 27th, 2014_\n\n - Added the `tags` [site config](http://documentjs.com/docs/DocumentJS.siteConfig.html). It allows custom tags.\n - Options on non record types [#72](https://github.com/bitovi/documentjs/issues/72).\n - `` is bold [#76](https://github.com/bitovi/documentjs/issues/76).\n - Sidebar parents fixed [#75](https://github.com/bitovi/documentjs/pull/75).\n - Code and Src are available almost everywhere [commit](https://github.com/bitovi/documentjs/commit/d51f8fb09e06c58fe8e12bd8ea6b93c7197c5ae1).\n - Added `singlePage` [site config](http://documentjs.com/docs/DocumentJS.siteConfig.html). 
[commit](https://github.com/bitovi/documentjs/commit/0ccfbffbd5b84de0c433e2102c84c6e56059426d)\n \n\n", "readme_type": "markdown", "hn_comments": "", "gh_updated_time": "", "gh_accessed_time": "", "hn_accessed_time": ""}, {"name": "aaronlumsden/progression.js", "link": "https://github.com/aaronlumsden/progression.js", "tags": [], "stars": 595, "description": "A jQuery plugin that gives users real time hints & progress updates as they complete forms", "lang": "JavaScript", "repo_lang": "", "readme": "Progression.js\n==============\n\nA jQuery plugin that gives users real time hints & progress updates as they complete forms\n\n\n### Documentation\n\n#### ..:: Getting Started\n\n##### Include the relevant files\n\nFirst, include jQuery and the progression.css and progress.js files.\nPlace these before `` section\n\n \n \n \n\n\n##### Create a form\n\nYou must give your form a unique ID. You then need to add a data\nattribute of `data-progression` to each element that needs to be a step\nin the form progression.\n\nThe helper text for the tooltip can be set by adding `data-helper` to\nthe element. This is demonstrated in the example form below.\n\n##### Initiate the plugin\n\nOnce you have created your form you will need to initiate the plugin.\n\nAt its most basic level you can initiate the plugin like this:\n\n    $(document).ready(function ($) {\n        $(\"#myform\").progression();\n    });\n\nIf you want to initiate the plugin with options then you can do so like this:\n\n    $(\"#myform\").progression({\n        tooltipWidth: '200',\n        tooltipPosition: 'right',\n        tooltipOffset: '50',\n        showProgressBar: true,\n        showHelper: true,\n        tooltipFontSize: '14',\n        tooltipFontColor: 'fff',\n        progressBarBackground: 'fff',\n        progressBarColor: '6EA5E1',\n        tooltipBackgroundColor: 'a2cbfa',\n        tooltipPadding: '10',\n        tooltipAnimate: true\n    });\n\n\n#### ..:: Options\n\n| Variable | Default Value | Description | Valid Options |\n| --- | --- | --- | --- |\n| tooltipWidth | 200 | The width in pixels that you would like the tooltip to be | |\n| tooltipPosition | right | Whether the tooltip should sit to the left or right of the form | left/right |\n| tooltipOffset | 50 | The offset of the tooltip in pixels | |\n| showProgressBar | true | Whether the progress bar should be displayed or not | true/false |\n| showHelper | true | Whether the helper text should be shown or not | true/false |\n| tooltipFontSize | 14 | Set the font size of the helper text in pixels | |\n| tooltipFontColor | ffffff | The hash color reference of the helper text | |\n| progressBarBackground | ffffff | The hash color reference of the progress bar background | |\n| progressBarColor | 6EA5E1 | The hash color reference of the progress bar | |\n| tooltipPadding | 10 | The padding for the tooltip in pixels | |\n| tooltipAnimate | true | Whether to animate the tooltip or not | true/false |
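\n\nFor reference, a minimal form using the `data-progression` and `data-helper` attributes described above might look like the following (the form fields and helper texts are only illustrative):\n\n    <form id=\"myform\">\n        <input type=\"text\" name=\"name\" data-progression data-helper=\"Enter your first name.\">\n        <input type=\"email\" name=\"email\" data-progression data-helper=\"We will never share your email.\">\n        <textarea name=\"message\" data-progression data-helper=\"Tell us a bit about yourself.\"></textarea>\n    </form>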
\n\n\n \n", "readme_type": "markdown", "hn_comments": "", "gh_updated_time": "", "gh_accessed_time": "", "hn_accessed_time": ""}, {"name": "crypto-browserify/crypto-browserify", "link": "https://github.com/crypto-browserify/crypto-browserify", "tags": [], "stars": 596, "description": "partial implementation of node's `crypto` for the browser", "lang": "JavaScript", "repo_lang": "", "readme": "# crypto-browserify\n\nA port of node's `crypto` module to the browser.\n\n[![Build Status](https://travis-ci.org/crypto-browserify/crypto-browserify.svg?branch=master)](https://travis-ci.org/crypto-browserify/crypto-browserify)\n[![js-standard-style](https://cdn.rawgit.com/feross/standard/master/badge.svg)](https://github.com/feross/standard)\n[![Sauce Test Status](https://saucelabs.com/browser-matrix/crypto-browserify.svg)](https://saucelabs.com/u/crypto-browserify)\n\nThe goal of this module is to reimplement node's crypto module,\nin pure javascript so that it can run in the browser.\n\nHere is the subset that is currently implemented:\n\n* createHash (sha1, sha224, sha256, sha384, sha512, md5, rmd160)\n* createHmac (sha1, sha224, sha256, sha384, sha512, md5, rmd160)\n* pbkdf2\n* pbkdf2Sync\n* randomBytes\n* pseudoRandomBytes\n* createCipher (aes)\n* createDecipher (aes)\n* createDiffieHellman\n* createSign (rsa, ecdsa)\n* createVerify (rsa, ecdsa)\n* createECDH (secp256k1)\n* publicEncrypt/privateDecrypt (rsa)\n* privateEncrypt/publicDecrypt (rsa)\n\n## todo\n\nthese features from node's `crypto` are still unimplemented.\n\n* createCredentials\n\n## contributions\n\nIf you are interested in writing a feature, please implement as a new module,\nwhich will be incorporated into crypto-browserify as a dependency.\n\nAll deps must be compatible with node's crypto\n(generate example inputs and outputs with node,\nand save base64 strings inside JSON, so that tests can run in the browser.\nsee [sha.js](https://github.com/dominictarr/sha.js)\n\nCrypto is _extra serious_ so please do not hesitate to review the code,\nand post comments if you do.\n\n## License\n\nMIT\n", "readme_type": "markdown", "hn_comments": "", "gh_updated_time": "", "gh_accessed_time": "", "hn_accessed_time": ""}, {"name": "hihayk/shaper", "link": "https://github.com/hihayk/shaper", "tags": [], "stars": 595, "description": "interface styles shaper", "lang": "JavaScript", "repo_lang": "", "readme": "# [SHAPER](https://hihayk.github.io/shaper/) \u2014 interface styles shaper\n\n![](https://github.com/hihayk/shaper/blob/master/public/shaper-editing.gif?raw=true)\n", "readme_type": "markdown", "hn_comments": "", "gh_updated_time": "", "gh_accessed_time": "", "hn_accessed_time": ""}, {"name": "BlokDust/BlokDust", "link": "https://github.com/BlokDust/BlokDust", "tags": [], "stars": 595, "description": ":musical_keyboard: A free to use web-based music making app. Make sounds, build instruments and share your creations.", "lang": "JavaScript", "repo_lang": "", "readme": "BlokDust\n========\n\n[![Build Status](https://travis-ci.org/BlokDust/BlokDust.svg)](https://travis-ci.org/BlokDust/BlokDust)\n \n\n[BlokDust](https://blokdust.com) is a web-based music making app. By joining blocks together, you can build synthesizers, put effects on your voice, remix & manipulate samples and arrange self-playing musical environments.\n\nYou can share what you make and if you want, you can expand on other people's creations. 
You can also contribute by tagging your tracks in SoundCloud with #blokdust so that your music is available for people to play with in the app.\n\nPlay the app at [blokdust.com](https://blokdust.com)\n\n![BlokDust](https://guide.blokdust.com/wp-content/uploads/2016/03/synth01_4b.jpg \"Creating a synth in BlokDust\")\n\n\nVisit our [Youtube channel](https://www.youtube.com/channel/UCukBbnIMiUZBbD4fJHrcHZQ) to view some examples of the app in action and follow us on [Facebook](https://www.facebook.com/blokdust) and [Twitter](https://twitter.com/blokdust) for updates. Also, share your creations with each other on the [BlokDust Subreddit](https://www.reddit.com/r/blokdust).\n\n\n\n# Guide\nFor tutorials, examples and other related features please visit our wiki/user companion site [guide.blokdust.com](https://guide.blokdust.com)\n\n\n\n# Frequently Asked Questions\n[guide.blokdust.com/frequently-asked-questions/](https://guide.blokdust.com/frequently-asked-questions/)\n\n\n\n# Requirements \nChrome browser recommended, desktop or tablet and an internet connection. If you have one, use a MIDI key controller for the best experience!\n\n\n\n# Contributing\n\nWe\u2019ll hopefully be publishing some developer notes when we have time. There\u2019s a lot of things we\u2019ve explored for the first time with this project and plenty that can be improved so we welcome any extra insight.\n\n#### Clone the repository \n`git clone https://github.com/BlokDust/BlokDust.git`\n\n#### Prerequisites \nBlokdust needs these things to work:\n- [Node](https://nodejs.org) `v12.22.0`\n- [Grunt](http://gruntjs.com/getting-started) `npm install -g grunt@v0.4.5`\n\n#### Workflow\n`npm install` to install the dependencies. \n\n##### Serve Blokdust on localhost:8000:\n`grunt serve:dev` \n\n##### Serve Blokdust on localhost:8000 and update on changes:\n`grunt watch:dev` \n\n##### Build & compile project to dist folder: \n`grunt dist`\n\n\n# Contact\nGet in touch with us at [blokdust@gmail.com](mailto:blokdust@gmail.com)\n", "readme_type": "markdown", "hn_comments": "tl;drOver the last few years, what used to be a flood of high-profile releases slowed to a trickle, however, and by 2022 it was more like a slow drip.Most people\u2019s first answer would probably be the pandemic. But the truth is, the AAA landscape has been shrinking for years. Everything has gotten too big, too expensive. The math is simple: games take longer to make, and need more developers to make them, so we are getting fewer of them.I've managed to get my hands on a PS5. What I'm playing the most is a PS4 release. Thank the stars, that thing is backward compatible.They will never surpass the most excellent Indian productionhttps://www.youtube.com/watch?v=N8OJB5qLZ6oAh, and I see you can watch RRR on Netflix.It's very cool to see Blender grow bigger each year and moving into the mainstream movie production.I can recommend Blender Bob's YouTube channel, he has worked in the VFX industry for many years and primarily uses Blender and shows how VFX is made:https://www.youtube.com/c/BlenderBobI hope to have one of these articles about FreeCAD one day. Maybe the stakes are lower for movies than engineering and that's why it hasn't happened yet.I watched this movie and visual feast for audience. Nice to see they used Blender pipeline for Visuals.There is also \"I Lost My Body\" which is full-length movie made in Blender. 
An interview with the director J\u00e9r\u00e9my Clapin[1] gives a bit of the backgound of the project.Both \"RRR\" [2] and \n\"I Lost My Body\" [3] are on Netflix.[1] https://www.blender.org/user-stories/i-lost-my-body-a-stunni...[2] https://www.netflix.com/us/title/81476453?s=i&trkid=13747225...[3] https://www.netflix.com/us/title/81120982?s=i&trkid=13747225...Who cares if they used Blender or Maya. As an Indian (and a South Indian at that), I must say I\u2019m embarrassed at the level of attention this movie is getting for its over the top use of cringey special effects. There are so many other worthwhile Indian films to watch and enjoy.The film certainly took the level of art and craft in Indian cinema several notches up. Pleasantly surprised that some of the visual effects used the Blender pipeline. Very cool.Multiple VFX companies worked on RRR. This article is about one of them, which used Blender.This movie is apparently available to U.S. audiences only on Netflix with a Hindi dub. I'd really like to see it in its original language with English subtitles. As well, the total size of the UHD stream is about 14 GB which isn't going to be the best quality for a 3 hour runtime.https://old.reddit.com/r/Bluray/comments/v3fzni/rrr_bluray/I'll watch it anyway. Looks like a fun time.What type of computers do they use to render these FX these days?Title is misleading. It was made primarily in Blender but not entirely in it.From the article> Blender was used in our entire pipeline aside from the FX departmentHoudini was likely used for FX work since Blender isn\u2019t quite competitive with it yet.Additionally, this is about a single studio\u2019s pipeline and not reflective of all the total work involved.Nothing but masterful! Blender is a tool to be reckon withExcited to see this posted here, I'm the author of the Cycles for Max plugin mentioned in the article. I was delighted when I first heard it was used in RRR.https://cyclesformax.netI don't understand the big hoo ha about the special effects in this. They look tacky and fake, like things I've seen in advertisements years ago.Love Blender, but in my experience, the biggest advantage that 3DS Max has over Blender is its ability to handle massive scenes. In VFX, especially those involving explosions, dust, crowds and suchlike, this can be a deal breaker.For anyone unfamiliar with the movie or seeking some perspective on the film, I did very much enjoy this review/retrospective of RRR by Patrick H Willems, a channel which I generally enjoy. The same video is also on Nebula for folks with a subscription.https://www.youtube.com/watch?v=dPU2D5FtjbwThe Critical Drinker - a YouTube movie reviewer who is very very selective about what he calls a good movie, says:\"RRR is the best movie you've never seen\"https://www.youtube.com/watch?v=HKN6FAKjFPUIf only 10% of the efforts going into special effects these days went into the actual story ...Eye candy can contribute only so much to a movie. Spend 10x the budget on special effects and the movie will get at most 1% better, sometimes it gets even worse. Superman 1 was much more enjoyable than most movies that came out of Marvel the last years.I can't finish watching this movie because I do not want to see the two dudes fight each other. lol. 
I want to live in a world where they still BFFs.Unrelated:If you enjoyed RRR and it's over the top drama, action sequences you might also like following movies.1) Bahubali 1 & 2 (same director)https://www.youtube.com/watch?v=G62HrubdD6o2) KGF chapter 1 and 2https://www.youtube.com/watch?v=Qah9sSIXJqkFunny someone is doing the same with Radioshack and Crypto [0].[0] https://www.radioshack.org/In 2007 I was in the US Army, deployed to the Middle East. (My memory is a bit hazy, it could have been my 2009 deployment instead.) Netflix was shipping DVD's to rent but they wouldn't ship to my APO AE address. I discovered that Blockbuster had a similar program, and they would ship to my APO AE address. So for six months or so, I was renting movies from Blockbuster on the other side of the world.I wish they had included pictures of his hardware.I miss Blockbuster. We had a small haven here called Family Video, but they unfortunately closed down due to the COVID lockdowns/lack of foot traffic. It was my family's spot to grab a Friday night movie and snacks, even through the pandemic. We were super bummed to see it go. Streaming just doesn't match the experience of browsing movies IRL. We'd usually leave with a few movies to watch that week, including a kids movie for the kids. We typically give up on endlessly browsing Netflix/etc. for something interesting and fall back to our \"usual.\" Boring.Hoping similar stores make a resurgence someday.I read this title as \"Piano man...\" several times until my brain finally parsed it correctly. It made for very curious thoughts on an alternate future involving Billy Joel.It's like the business version of the Byzantine Empire.I wonder how many conversations there were in the 80s or 90s about how video on demand would never become mainstream because of the bandwidth requirements.I think that physically going to _any_ special destination such as for tourism may be on a steep decline in the next few decades. VR glasses and goggles will be coming out that are very lightweight, comfortable, and convincing. We will also have eye tracking and eye contact in VR. There will be more advanced, faster more realistic 3d scanning of locations. There will also be the ability to \"live scan\", transmit and faithfully reproduce people moving around in an area. This will take advantage of advancements in graphics and AI.Haptic glove technology will improve.The conversation will be something like \"remember when you had to actually _travel_ 5000 miles to see the Sistine Chapel or the last Blockbuster?\"Un-paywalled: https://archive.ph/jotV3I run IT for a franchise chain. All requests to me are supposed to come through corporate, but franchisees and sometimes managers and even front desk employees try to do end-runs and get directly to me with software issues, local networking issues, etc. I want to stress how difficult this would be for one person to manage, and how much work for little pay it probably was if it was based on service contracts with individual stores. At least, when there were 30 independent Blockbusters left. Now it's probably pretty chill.I am old enough to remember when the wife would call me at work on a Friday and say hey, stop by the video store and get a movie for tonight and I'll stop and get wine and take out. What was special was the \"Dave\" at the video rental place (it wasn't blockbuster but whatever). 
I don't recall the guy's name, but he always remembered me and what we liked and had great suggestions and sometimes if you said, nah but how about something completely different, he would always find us something great. Or usually (one time he recommended Eraser head, we still laugh about that one, so even though it was a dud, great memory).I get it, today's AI is pretty good, meh, not really, but it would be hard to beat my old \"Dave\". Maybe someday.Have you ever seen the post with the person that accidentally stumbled into a film set with an old Blockbuster \"revived\", hehhttps://imgur.com/gallery/n1l3O58Do you think he had to recreate blockbuster's central servers from scratch or did he get the blessing to maintain the old software?Growing up in the 80s with VHS, Betamax, Laserdiscs (if anyone recalls), and being a dj in the late 90s when the thought of a \"USB stick instead of traveling with all this vinyl was an impossibility\", makes this whole nostalgia tour a fun one. I think we all forget though just how poor the quality was back then, and what we've become accustomed to, with VHS being 240 lines, DVD 480p, etc. It's like reminiscing about the first iPhone and then looking at one and realizing how damn small it actually was compared to modern versions.I started converting / collecting most of my movie collection onto a localized server years ago, and glad I did. Though I rarely watch all my old movies (a growing list of about 1000 including most of my favorite TV shows), the end game I think we all know is everything streamed, with no actual ownership of content. It's not a terrible notion, but the problem I think we've all seen is it's now turned into a corporate ownership game, and you never know where the content you're interested in watching is. One day Star Trek is on Netflix, the next Paramount, etc.The only problem has been keeping up with resolution changes, even though I'm a firm believer in unless you're watching on something well over 100\" a nice high-quality 1080P file looks just great on a large 85\" tv (which I currently have).My fondest Blockbuster memories were around video games. The very best weekends were when my mom would take us there on Friday after school and let us each pick out a game. My sister and I would fight for turns with the TV all weekend.Just getting to explore a new game for a little bit and then try something else next time was so much fun; you never knew what you were gonna get, just going off the box art. Even better was during the cartridge years, when you'd take a game home and it might already have a few different saves on it from other people, and you'd get to visit their characters and worlds, jump in at different points in the game, and try to imagine how it all fit together.There was a gap of many years between Netflix killing Blockbuster, and game subscriptions becoming a thing where you could try them out casually again. Though of course even then, all the same physical nostalgia is missed; browsing the aisles, scrutinizing the boxes to try and figure out what it'll be like when you take it home, etc.If you want your mind blown - \"Who really killed Blockbuster Video?\"https://podcasts.apple.com/us/podcast/who-really-killed-bloc...It's amazing that Blockbuster had Netflix on the ropes, until one notorious activist investor showed up and basically gave Netflix the win.One man is responsible for creating a completely different timeline when it comes to video streaming. 
Very similar to how we would be living in a different reality if GM's ahead-of-its-time EV1 was not mysteriously disappeared in the 90s.> \u201cIt\u2019s the thing that wakes me up in the middle of the night: \u2018Oh my gosh, what happens if we run out of computers?\u2019\u201d he said. \u201cWell, it is what it is. It\u2019s just going to run until it doesn\u2019t run anymore.\u201dCouldn't he just virtualize? There is USB-Floppy devices, these days, to get even the last bit of compatibility right.It's sort of bizarre that an entrenched, widely despised corporate behemoth thoroughly deserving its own demise has turned into an anachronistic mom-and-pop shop that just gets by. But isn't this the worst of both worlds? It's dystopian nostalgia. Maybe I just have too much of a grudge against the 1980's...I worked at a corporate Blockbuster for just over a year during high school. I made $7.15/hr and I commuted using my bicycle. Best job I ever had.Welp at least he doesn\u2019t have any scaling issues. Scaled down to the minimal set.I was in Bend a few months ago and swung by the last Blockbuster. I rented a movie (and bought a t-shirt), and had to sign up for an account. I received a laminated paper Blockbuster card with an account number scrawled in Sharpie on the back. I wouldn't have guessed that the store was still calling out to these old servers; that's pretty cool.The movie cost 99 cents to rent, which I thought was surprisingly cheap. The clerks were talking about how people come in to take pictures (no surprise there) and were usually inconsiderate about including the clerks in photos.It smelled exactly the same in there. It was neat.Stan MarshDon't know if it's been mentioned or not, but what killed the video store for me was the late fees. Hollywood Video had a parking lot drop box that was good up until midnight. I worked the overnight shift so I would drop the seen vids at 9:15pm and then drive off to work. Almost every time they would charge me a late fee! When I would call them on the fee, they always dropped it, making me wonder about the other people who wouldn't protest.Also, many times I witnessed exasperated parents and grandparents paying a huge late fee because their kids forgot to drop them off.What if the blood collectors just find a way to filter out PFAs so they don\u2019t get into your donation and they collect a pure sample, and overtime you just keep saturating your blood with PFAs through constant donations? No thanks.Blood letting, huh?why not just get regularly blood tested? blood testing also extracts blood. but on top of it, you get informationDonate your perfluoroalkyl and polyfluoroalkyl substances (PFAS) to those in need!Hmm:> As to the question of what happens to recipients of this blood: \u201cPerfluoroalkyl and polyfluoroalkyl substances are ubiquitous, and no threshold has been identified that poses an increased risk to recipients of donated blood components.\u201dThis means anybody who gets a blood donation is getting a nice dose of forever chemicals.What is the optimal donation interval? Article mentions that some donated every 6 weeks and others every 12 but it doesn't appear to mention how the interval affected the forever chemical levels. Is more frequent donation better because it's more opportunity to remove the forever chemicals?Back to leeches; got itThe article mentions the reduction, not beginlevels of the substances. It turns out that the reduction during the test period is about 30% (for at least one of the substance categories). 
A meaningful reduction for sure, but it should be interesting to test them after a few years of blood donation. On page 6/7 of the JAMA pdf you find the relevant graph.https://jamanetwork.com/journals/jamanetworkopen/articlepdf/...This is not the only health benefit of regular blood donations:Donation of Blood Is Associated with Reduced Risk of Myocardial Infarctionhttps://academic.oup.com/aje/article/148/5/445/76921Regular blood donation may help in the management of hypertensionhttps://onlinelibrary.wiley.com/doi/abs/10.1111/trf.13428Regular blood donation may lower iron stores, and this in turn lowers lipid peroxidation.https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3663474/Sharing is caring?Or letting a dozen leeches have their way with your chestmeat.Those medieval sawbones knew some stuff.I don't like how \"toxic forever chemicals in the bloodstream\" is a normal thingVeritasium has a great YouTube video about this: \"The Man Who Accidentally Killed The Most People In History\"[0][0] - https://www.youtube.com/watch?v=IV3dnLzthDAAKA bloodlettingAre there any blood test as a service sites? \nWhere I can point and click order a blood test panel ?We should be able to test our blood for PFAS on a regular basis and our doctors should be able to let us know if we need to filter our blood.But PFAS testing is not the norm?Per a Tedx talk on this subject:\nhttps://www.nrdc.org/experts/anna-reade/pfas-blood-tests-nee...>> Unfortunately, a PFAS blood test is not a routine lab test that can be processed by your local lab and covered by your health insurance. In fact, there are only a handful of labs certified to test for PFAS in blood and the out of pocket cost for this test is out of reach for most people.I wonder how less effective double red donations are at this since they return your plasma to you during the donation. Sounds like plasma donations are the biggest bang for the buck, possible followed by single whole blood donations?Sucks to be gay in the United States then. FDA prohibits gay men from donating blood based on outdated fears of contamination.I would not have guessed that bleeding would come back as a medical treatment in the year 2022.So, the solution to pollution REALLY IS dilution.This might be great! Or it might be meaningless. The only quantifying passage I see is this one:> \u201cPlasma donation was the most effective intervention, reducing mean serum perfluorooctane sulfonate levels by 2.9 ng/mL compared with a 1.1-ng/mL reduction with blood donation, a significant differenceBut how much perfluorooctane sulfonate did the study's subjects start with? I don't care if one treatment accomplished more than another until I know that at least one of them made a difference worth making.I know some people are jumping on the idea that these \"forever chemicals\" just get passed onto the blood recipient, but that misses the \"Regular\" part. Sure, the first few donations have the higher concentrations of those chemicals, but subsequent donations have fewer forever chemicals. The more you do it, the better quality it becomes.I've been waiting for the Giving Plague to become real:https://www.davidbrin.com/fiction/givingplague.htmllol this is literally bloodletting. Like what people in the olden days did. Lol, comes back full circle.Having grown up in Germany I cannot donate blood here in the US (mad cow) so I guess I'll stew in my toxins...How about donating platelets? 
Due to my blood type, they want my platelets.Finally, a reason for me to donate.Another way to say that Bloodletting can reduce toxic forever chemicals in the bloodstream.It looks like the school blood letting is the way to health. And scientists laughed at people using leeches back in the day.At least about 15 years ago, we really had no idea the long-term effects of donating blood as often as the Red Cross allows over multi-year periods.I'm negative for cytomegalovirus, which makes my blood suitable for newborns, cancer patients, HIV patents, etc. I used to donate as often as the Red Cross/New York Blood Center would let me (donating plasma in the in-between times when it was too soon to donate whole blood), but at some point that forced me to get off of my red-meat-once-a-week diet because my iron levels were bouncing down around the donation limits.I didn't realize one check-up with my regular doctor was going to involve a blood screening, so when the nurse drew blood, I told her that I was going to test as anemic since I donated less than a week earlier. I saw the nurse enter my comments into the computer system, but I still got a very concerned call from my doctor a week later about being anemic. The doctor hadn't read the comment on my chart that I had donated recently, but still begged me to eat more red meat and liver and/or cut back on my blood donation frequency. I asked about spinach, broccoli, etc., and she said it was just very difficult to get enough iron that way if I was donating blood that often.I called up the New York Blood Center to see if my regular doctor was over-reacting and/or they had any suggestions about keeping my iron high while donating as often as possible and still keeping red meat consumption low. They told me that there weren't any long-term studies on people who donate as often as they're allowed over long periods. Also, they reaffirmed that red meat and liver are really much more dense sources of iron than any plant sources.Anyway, when you donate whole blood/plasma regularly and you're CMV-negative, they cross-check your blood for leukocyte compatibility with immunocompromised patients. Three times, I was asked to donate leukocytes for kids with cancer. Two times I was able to donate leukocytes, and the third time, my hematocrit was too low. I highly recommend it, if nothing else, you feel like a really decent person for a couple of weeks after donating leukocytes to a kid with cancer. It's a pretty low-effort way to leave the world a little better than you found it.I moved to Hong Kong for a while, and there, they take 400 mL instead of a 500 mL, and let you donate less often, but the minimum body weight limit for donation is lower. (People there are smaller on average and eat less red meat than the average American.) I was also very involved with dragon boat racing there, and being slightly anemic makes training harder, so I only donated once a year in HK. I really should have donated more in the off season.Now, I've got a young son who's blood-compatible with me, so even though they'd never run my blood directly into my son, my wife would really prefer that I keep my blood in my veins for emergencies. I'd still prefer to donate blood, but my wife really doesn't ask that much from me, and it's a small thing to give up for a few years.So we are back to bloodletting?As a regular donator, it's really important to watch your iron levels. 
After donating double red blood cells every time I could for two years my iron levels dropped significantly and had a very negative impact on my quality of life until I figured out what was wrong.Now I eat more spinach and chicken to give myself a boost but I'm taking a little break to recoup before going back again.Either way, donating's great, totally should do it, it's super essential and there're always shortages. Just make sure you're supporting yourself if you make it a habit.Canadian blood services specifically doesn't allow me to donate blood. Admittedly, I fear needles pretty fiercely anyway so it's not something I'm going to be fighting.I find it interesting though that the argument to donate blood is that one's blood might be toxic. That giving away your toxic blood might make your own blood less toxic.So the pfas just mostly hang out in the bloodstream?That seems kind of strange they never get filtered or end up in organs.I've been wondering about this for a long time.I get really anxious when it come to needles being put in me. The article only mentions blood donations (presumably because they are trying to advocate for some behavior, not just provide science) but I assume any blood letting would work the same. IIRC fishers would check their bodies afterwards due to leaches sucking blood without them noticing. I wonder how beneficial and pain free it would be to receive leach treatment?This is the first example of detoxifying your body might actually be real.There\u2019s a theory (see slimemoldtimemold.com) that PFAS cause obesity (or, rather, cause increased hunger which in turn causes obesity). This seems like a great way to actually test that hypothesis.\u201cForever chemicals\u201d is kind of a scary marketing term that helps sell ad views.PFAS materials (teflon, high temperature o-rings, refrigeration fluids, fire-fighting foams, anti-cancer drugs and other medicines, etc.) encompass such a broad range of chemistries (anything with the fluorine atom attached to carbon) that have many many orders of magnitude differences in half-life.It\u2019s analogous to labeling anything radioactive (bananas) as a \u201cforever chemical\u201d because some radioactive nuclear waste does have >10k year half-lives.The nice thing about PFAS compared to nuclear though is that even the \u2018forever\u2019 versions that have really long half-lives can be easily catalyzed to break down using the right materials/temperature/pressure.People are looking for catalysis processes to get rid of nuclear waste using atto-second lasers: https://bigthink.com/the-present/laser-nuclear-waste/\n(Would be amazing if it works)Also, fluorine incorporation in water is starting to be reconsidered by some in the medical community (greatest volume of human fluorine consumption by far): https://pubmed.ncbi.nlm.nih.gov/8169995/Takeaway: It's possible to reduce them.That's good. The rest: Eh. It's fairly weak as some kind of medical recommendation if you ask me.American Red Cross keeps donor DNA for \"medical research\" [0].Eventually, DNA sequencing will become extremely cheap and Red Cross will sequence all of their samples. Then the DNA data will get stolen, as happened to the Red Cross's refugee database recently [1].Therefore, people who value their privacy and the privacy of their relatives may wish to use blood-letting instead of blood donation.[0] https://www.redcrossblood.org/content/dam/redcrossblood/cont...[1] https://www.icrc.org/en/document/sophisticated-cyber-attack-...And.. 
pass them on to some other poor bloke?Anyway it does seem like certain stuff seems to build up in the blood-stream over the years, with the body having no way of removing it, and that bleeding and receiving plasma effectively gets rid of it.I wonder if this could be considered as a real medical treatment and offered to people.There's also been some recent evidence that donating blood can increase lifespan in men, with lowering serum iron levels suggested as the causal link.Nice try, vampires!Could you just filter your blood externally, dialysis-style, and get rid of these chemicals that way?Sheesh, I can't believe the percent of negative comments! Someone finally figured a way to reduce PFAS. It's a hugely positive finding, especially in light of people being able to keep their red blood cells. Hopefully it can be ruled as a safe long-term solution for now. Looking forward to reading updates on this research.And btw, donating plasma doesn't mean other people will get your PFAS; many things can be done with plasma besides injecting into other people.I just want to say that those medieval doctors, I guess, were on to something! Blood letting is a thing again!On that note I guess leaches would work too.So does this imply that women have lower levels than similarly exposed men?Shouldn't this mean that women have fewer of these chemicals due to losing blood regularly on their cycle?I shall have my servant bring me my leeches at once!I used to donate blood regularly. I would donate more regularly if I got a full blood panel with each donation. I've been quoted $600-2500. Surely you can comp that for me if I give you a pint.Blood letting then. Is that the same?Wow. Before, I was selling plasma to afford cigarettes while serving in AmeriCorps. Maybe now I'll sell plasma to detox instead.", "gh_updated_time": "", "gh_accessed_time": "", "hn_accessed_time": ""}, {"name": "postcss/postcss-custom-properties", "link": "https://github.com/postcss/postcss-custom-properties", "tags": [], "stars": 595, "description": "Use Custom Properties in CSS", "lang": "JavaScript", "repo_lang": "", "readme": "\n\n# PostCSS Custom Properties [ ][postcss]\n\n[![NPM Version][npm-img]][npm-url]\n[![CSS Standard Status][css-img]][css-url]\n[![Build Status][cli-img]][cli-url]\n[![Support Chat][git-img]][git-url]\n\n[PostCSS Custom Properties] lets you use Custom Properties in CSS, following\nthe [CSS Custom Properties] specification.\n\n[!['Can I use' table](https://caniuse.bitsofco.de/image/css-variables.png)](https://caniuse.com/#feat=css-variables)\n\n```pcss\n:root {\n --color: red;\n}\n\nh1 {\n color: var(--color);\n}\n\n/* becomes */\n\n:root {\n --color: red;\n}\n\nh1 {\n color: red;\n color: var(--color);\n}\n```\n\n**Note:** This plugin only processes variables that are defined in the `:root` selector.\n\n## Usage\n\nAdd [PostCSS Custom Properties] to your project:\n\n```bash\nnpm install postcss-custom-properties --save-dev\n```\n\nUse [PostCSS Custom Properties] to process your CSS:\n\n```js\nconst postcssCustomProperties = require('postcss-custom-properties');\n\npostcssCustomProperties.process(YOUR_CSS /*, processOptions, pluginOptions */);\n```\n\nOr use it as a [PostCSS] plugin:\n\n```js\nconst postcss = require('postcss');\nconst postcssCustomProperties = require('postcss-custom-properties');\n\npostcss([\n postcssCustomProperties(/* pluginOptions */)\n]).process(YOUR_CSS /*, processOptions */);\n```\n\n[PostCSS Custom Properties] runs in all Node environments, with special instructions for:\n\n| 
[Node](INSTALL.md#node) | [PostCSS CLI](INSTALL.md#postcss-cli) | [Webpack](INSTALL.md#webpack) | [Create React App](INSTALL.md#create-react-app) | [Gulp](INSTALL.md#gulp) | [Grunt](INSTALL.md#grunt) |\n| --- | --- | --- | --- | --- | --- |\n\n## Options\n\n### preserve\n\nThe `preserve` option determines whether Custom Properties and properties using\ncustom properties should be preserved in their original form. By default, both\nof these are preserved.\n\n```js\npostcssCustomProperties({\n preserve: false\n});\n```\n\n```pcss\n:root {\n --color: red;\n}\n\nh1 {\n color: var(--color);\n}\n\n/* becomes */\n\nh1 {\n color: red;\n}\n```\n\n### importFrom\n\nThe `importFrom` option specifies sources where Custom Properties can be imported\nfrom, which might be CSS, JS, and JSON files, functions, and directly passed\nobjects.\n\n```js\npostcssCustomProperties({\n importFrom: 'path/to/file.css' // => :root { --color: red }\n});\n```\n\n```pcss\nh1 {\n color: var(--color);\n}\n\n/* becomes */\n\nh1 {\n color: red;\n}\n```\n\nMultiple sources can be passed into this option, and they will be parsed in the\norder they are received. JavaScript files, JSON files, functions, and objects\nwill need to namespace Custom Properties using the `customProperties` or\n`custom-properties` key.\n\n```js\npostcssCustomProperties({\n importFrom: [\n 'path/to/file.css', // :root { --color: red; }\n 'and/then/this.js', // module.exports = { customProperties: { '--color': 'red' } }\n 'and/then/that.json', // { \"custom-properties\": { \"--color\": \"red\" } }\n {\n customProperties: { '--color': 'red' }\n },\n () => {\n const customProperties = { '--color': 'red' };\n\n return { customProperties };\n }\n ]\n});\n```\n\nSee example imports written in [CSS](test/import-properties.css),\n[JS](test/import-properties.js), and [JSON](test/import-properties.json).\n\n### exportTo\n\nThe `exportTo` option specifies destinations where Custom Properties can be exported\nto, which might be CSS, JS, and JSON files, functions, and directly passed\nobjects.\n\n```js\npostcssCustomProperties({\n exportTo: 'path/to/file.css' // :root { --color: red; }\n});\n```\n\nMultiple destinations can be passed into this option, and they will be parsed\nin the order they are received. 
JavaScript files, JSON files, and objects will\nneed to namespace Custom Properties using the `customProperties` or\n`custom-properties` key.\n\n```js\nconst cachedObject = { customProperties: {} };\n\npostcssCustomProperties({\n exportTo: [\n 'path/to/file.css', // :root { --color: red; }\n 'and/then/this.js', // module.exports = { customProperties: { '--color': 'red' } }\n 'and/then/this.mjs', // export const customProperties = { '--color': 'red' } }\n 'and/then/that.json', // { \"custom-properties\": { \"--color\": \"red\" } }\n 'and/then/that.scss', // $color: red;\n cachedObject,\n customProperties => {\n customProperties // { '--color': 'red' }\n }\n ]\n});\n```\n\nSee example exports written to [CSS](test/export-properties.css),\n[JS](test/export-properties.js), [MJS](test/export-properties.mjs),\n[JSON](test/export-properties.json) and [SCSS](test/export-properties.scss).\n\n[cli-img]: https://img.shields.io/travis/postcss/postcss-custom-properties/master.svg\n[cli-url]: https://travis-ci.org/postcss/postcss-custom-properties\n[css-img]: https://github.com/postcss/postcss-custom-properties/workflows/test/badge.svg\n[css-url]: https://github.com/postcss/postcss-custom-properties/actions/workflows/test.yml?query=workflow/test\n[git-img]: https://img.shields.io/badge/support-chat-blue.svg\n[git-url]: https://gitter.im/postcss/postcss\n[npm-img]: https://img.shields.io/npm/v/postcss-custom-properties.svg\n[npm-url]: https://www.npmjs.com/package/postcss-custom-properties\n\n[CSS Custom Properties]: https://www.w3.org/TR/css-variables-1/\n[PostCSS]: https://github.com/postcss/postcss\n[PostCSS Custom Properties]: https://github.com/postcss/postcss-custom-properties\n", "readme_type": "markdown", "hn_comments": "", "gh_updated_time": "", "gh_accessed_time": "", "hn_accessed_time": ""}, {"name": "kriskowal/asap", "link": "https://github.com/kriskowal/asap", "tags": [], "stars": 595, "description": "High-priority task queue for Node.js and browsers", "lang": "JavaScript", "repo_lang": "", "readme": "# ASAP\n\n[![Build Status](https://travis-ci.org/kriskowal/asap.png?branch=master)](https://travis-ci.org/kriskowal/asap)\n\nPromise and asynchronous observer libraries, as well as hand-rolled callback\nprograms and libraries, often need a mechanism to postpone the execution of a\ncallback until the next available event.\n(See [Designing API\u2019s for Asynchrony][Zalgo].)\nThe `asap` function executes a task **as soon as possible** but not before it\nreturns, waiting only for the completion of the current event and previously\nscheduled tasks.\n\n```javascript\nasap(function () {\n // ...\n});\n```\n\n[Zalgo]: http://blog.izs.me/post/59142742143/designing-apis-for-asynchrony\n\nThis CommonJS package provides an `asap` module that exports a function that\nexecutes a task function *as soon as possible*.\n\nASAP strives to schedule events to occur before yielding for IO, reflow,\nor redrawing.\nEach event receives an independent stack, with only platform code in parent\nframes and the events run in the order they are scheduled.\n\nASAP provides a fast event queue that will execute tasks until it is\nempty before yielding to the JavaScript engine's underlying event-loop.\nWhen a task gets added to a previously empty event queue, ASAP schedules a flush\nevent, preferring for that event to occur before the JavaScript engine has an\nopportunity to perform IO tasks or rendering, thus making the first task and\nsubsequent tasks semantically indistinguishable.\nASAP uses a variety of techniques to 
preserve this invariant on different\nversions of browsers and Node.js.\n\nBy design, ASAP prevents input events from being handled until the task\nqueue is empty.\nIf the process is busy enough, this may cause incoming connection requests to be\ndropped, and may cause existing connections to inform the sender to reduce the\ntransmission rate or stall.\nASAP allows this on the theory that, if there is enough work to do, there is no\nsense in looking for trouble.\nAs a consequence, ASAP can interfere with smooth animation.\nIf your task should be tied to the rendering loop, consider using\n`requestAnimationFrame` instead.\nA long sequence of tasks can also effect the long running script dialog.\nIf this is a problem, you may be able to use ASAP\u2019s cousin `setImmediate` to\nbreak long processes into shorter intervals and periodically allow the browser\nto breathe.\n`setImmediate` will yield for IO, reflow, and repaint events.\nIt also returns a handler and can be canceled.\nFor a `setImmediate` shim, consider [YuzuJS setImmediate][setImmediate].\n\n[setImmediate]: https://github.com/YuzuJS/setImmediate\n\nTake care.\nASAP can sustain infinite recursive calls without warning.\nIt will not halt from a stack overflow, and it will not consume unbounded\nmemory.\nThis is behaviorally equivalent to an infinite loop.\nJust as with infinite loops, you can monitor a Node.js process for this behavior\nwith a heart-beat signal.\nAs with infinite loops, a very small amount of caution goes a long way to\navoiding problems.\n\n```javascript\nfunction loop() {\n asap(loop);\n}\nloop();\n```\n\nIn browsers, if a task throws an exception, it will not interrupt the flushing\nof high-priority tasks.\nThe exception will be postponed to a later, low-priority event to avoid\nslow-downs.\nIn Node.js, if a task throws an exception, ASAP will resume flushing only if\u2014and\nonly after\u2014the error is handled by `domain.on(\"error\")` or\n`process.on(\"uncaughtException\")`.\n\n## Raw ASAP\n\nChecking for exceptions comes at a cost.\nThe package also provides an `asap/raw` module that exports the underlying\nimplementation which is faster but stalls if a task throws an exception.\nThis internal version of the ASAP function does not check for errors.\nIf a task does throw an error, it will stall the event queue unless you manually\ncall `rawAsap.requestFlush()` before throwing the error, or any time after.\n\nIn Node.js, `asap/raw` also runs all tasks outside any domain.\nIf you need a task to be bound to your domain, you will have to do it manually.\n\n```js\nif (process.domain) {\n task = process.domain.bind(task);\n}\nrawAsap(task);\n```\n\n## Tasks\n\nA task may be any object that implements `call()`.\nA function will suffice, but closures tend not to be reusable and can cause\ngarbage collector churn.\nBoth `asap` and `rawAsap` accept task objects to give you the option of\nrecycling task objects or using higher callable object abstractions.\nSee the `asap` source for an illustration.\n\n\n## Compatibility\n\nASAP is tested on Node.js v0.10 and in a broad spectrum of web browsers.\nThe following charts capture the browser test results for the most recent\nrelease.\nThe first chart shows test results for ASAP running in the main window context.\nThe second chart shows test results for ASAP running in a web worker context.\nTest results are inconclusive (grey) on browsers that do not support web\nworkers.\nThese data are captured automatically by [Continuous\nIntegration][].\n\n[Continuous Integration]: 
https://github.com/kriskowal/asap/blob/master/CONTRIBUTING.md\n\n![Browser Compatibility](http://kriskowal-asap.s3-website-us-west-2.amazonaws.com/train/integration-2/saucelabs-results-matrix.svg)\n\n![Compatibility in Web Workers](http://kriskowal-asap.s3-website-us-west-2.amazonaws.com/train/integration-2/saucelabs-worker-results-matrix.svg)\n\n## Caveats\n\nWhen a task is added to an empty event queue, it is not always possible to\nguarantee that the task queue will begin flushing immediately after the current\nevent.\nHowever, once the task queue begins flushing, it will not yield until the queue\nis empty, even if the queue grows while executing tasks.\n\nThe following browsers allow the use of [DOM mutation observers][] to access\nthe HTML [microtask queue][], and thus begin flushing ASAP's task queue\nimmediately at the end of the current event loop turn, before any rendering or\nIO:\n\n[microtask queue]: http://www.whatwg.org/specs/web-apps/current-work/multipage/webappapis.html#microtask-queue\n[DOM mutation observers]: http://dom.spec.whatwg.org/#mutation-observers\n\n- Android 4\u20134.3\n- Chrome 26\u201334\n- Firefox 14\u201329\n- Internet Explorer 11\n- iPad Safari 6\u20137.1\n- iPhone Safari 7\u20137.1\n- Safari 6\u20137\n\nIn the absence of mutation observers, there are a few browsers, and situations\nlike web workers in some of the above browsers, where [message channels][]\nwould be a useful way to avoid falling back to timers.\nMessage channels give direct access to the HTML [task queue][], so the ASAP\ntask queue would flush after any already queued rendering and IO tasks, but\nwithout having the minimum delay imposed by timers.\nHowever, among these browsers, Internet Explorer 10 and Safari do not reliably\ndispatch messages, so they are not worth the trouble to implement.\n\n[message channels]: http://www.whatwg.org/specs/web-apps/current-work/multipage/web-messaging.html#message-channels\n[task queue]: http://www.whatwg.org/specs/web-apps/current-work/multipage/webappapis.html#concept-task\n\n- Internet Explorer 10\n- Safari 5.0-1\n- Opera 11-12\n\nIn the absence of mutation observers, these browsers and the following browsers\nall fall back to using `setTimeout` and `setInterval` to ensure that a `flush`\noccurs.\nThe implementation uses both and cancels whatever handler loses the race, since\n`setTimeout` tends to occasionally skip tasks in unisolated circumstances.\nTimers generally delay the flushing of ASAP's task queue for four milliseconds.\n\n- Firefox 3\u201313\n- Internet Explorer 6\u201310\n- iPad Safari 4.3\n- Lynx 2.8.7\n\n\n## Heritage\n\nASAP has been factored out of the [Q][] asynchronous promise library.\nIt originally had a na\u00efve implementation in terms of `setTimeout`, but\n[Malte Ubl][NonBlocking] provided an insight that `postMessage` might be\nuseful for creating a high-priority, no-delay event dispatch hack.\nSince then, Internet Explorer proposed and implemented `setImmediate`.\nRobert Kati\u0107 began contributing to Q by measuring the performance of\nthe internal implementation of `asap`, paying particular attention to\nerror recovery.\nDomenic, Robert, and Kris Kowal collectively settled on the current strategy of\nunrolling the high-priority event queue internally regardless of what strategy\nwe used to dispatch the potentially lower-priority flush event.\nDomenic went on to make ASAP cooperate with Node.js domains.\n\n[Q]: https://github.com/kriskowal/q\n[NonBlocking]: http://www.nonblocking.io/2011/06/windownexttick.html\n\nFor 
further reading, Nicholas Zakas provided a thorough article on [The\nCase for setImmediate][NCZ].\n\n[NCZ]: http://www.nczonline.net/blog/2013/07/09/the-case-for-setimmediate/\n\nEmber\u2019s RSVP promise implementation later [adopted][RSVP ASAP] the name ASAP but\nfurther developed the implementation.\nIn particular, the `MessagePort` implementation was abandoned due to interaction\n[problems with Mobile Internet Explorer][IE Problems] in favor of an\nimplementation backed by the newer and more reliable DOM `MutationObserver`\ninterface.\nThese changes were back-ported into this library.\n\n[IE Problems]: https://github.com/cujojs/when/issues/197\n[RSVP ASAP]: https://github.com/tildeio/rsvp.js/blob/cddf7232546a9cf858524b75cde6f9edf72620a7/lib/rsvp/asap.js\n\nIn addition, ASAP was factored into `asap` and `asap/raw`, such that `asap` remained\nexception-safe, but `asap/raw` provided a tight kernel that could be used for\ntasks that guaranteed that they would not throw exceptions.\nThis core is useful for promise implementations that capture thrown errors in\nrejected promises and do not need a second safety net.\nAt the same time, the exception handling in `asap` was factored into separate\nimplementations for Node.js and browsers, using the [Browserify][Browser\nConfig] `browser` property in `package.json` to instruct browser module loaders\nand bundlers, including [Browserify][], [Mr][], and [Mop][], to use the\nbrowser-only implementation.\n\n[Browser Config]: https://gist.github.com/defunctzombie/4339901\n[Browserify]: https://github.com/substack/node-browserify\n[Mr]: https://github.com/montagejs/mr\n[Mop]: https://github.com/montagejs/mop\n\n## License\n\nCopyright 2009-2014 by Contributors\nMIT License (enclosed)\n\n", "readme_type": "markdown", "hn_comments": "", "gh_updated_time": "", "gh_accessed_time": "", "hn_accessed_time": ""}, {"name": "Imangazaliev/git-tips", "link": "https://github.com/Imangazaliev/git-tips", "tags": ["git", "git-tips", "tips", "git-cheatsheet"], "stars": 594, "description": "\u0427\u0430\u0441\u0442\u043e \u0438\u0441\u043f\u043e\u043b\u044c\u0437\u0443\u0435\u043c\u044b\u0435 \u0442\u0440\u044e\u043a\u0438 \u0438 \u0441\u043e\u0432\u0435\u0442\u044b \u043f\u0440\u0438 \u0440\u0430\u0431\u043e\u0442\u0435 \u0441 Git", "lang": "JavaScript", "repo_lang": "", "readme": "# Git Tips\n\nFrequently used tricks and tips when working with Git.\n\nWant to add to the list? Check out [CONTRIBUTING.md](CONTRIBUTING.md)\n\n[English](http://git.io/git-tips) | [\u4e2d\u6587](https://github.com/521xueweihan/git-tips) | [Russian](https://github.com/Imangazaliev/git-tips)\n\n### Tools:\n\n* [git-tip](https://www.npmjs.com/package/git-tip) is a console utility that makes it easy to use all of these commands. 
Docker container can be found [here](https://github.com/djoudi5/docker-git-tip)\n\n**P.S:** All these commands have been tested in `git version 2.7.4 (Apple Git-66)`\n\n\n\n### Branch\n\n- [Create a new branch and switch to it](#Create a new branch and switch to it)\n- [Create new branch without parent branch](#Create new branch without parent branch)\n- [Fast-switching to the previous branch](#Quick-switching-to-the-previous-branch)\n- [List of local and remote branches](#List-of-local-and-remote-branches)\n- [List of branches in the remote repository](#List-of-branches-in-the-remote-repository)\n- [Show all branches (including remote branches), as well as the last commit in them](#Show-all-branches-including-and-remote-branches-and-same-last- commit-in-them)\n- [Rename branch](#Rename-branch)\n- [Delete local branch](#Delete-local-branch)\n- [Delete branch in remote repository](#Delete branch in remote repository)\n- [Show current branch name](#Show-current-branch-name)\n- [Show all branches not merged into master](#Show-all-branches-not-merged-in-master)\n- [Show a list of branches that are already merged into master](#Show a list of branches that are already merged into master)\n- [Move feature branch to master and merge it into master](#Move-feature-branch-to-master-and-merge-it-to-master)\n- [Delete branches that are already merged to master](#Delete-branches-already-merged-to-master)\n- [Find branches that contain a commit with the given hash](#Find-branches-that-contain-tocommit-with-hash-specified)\n- [Track upstream branch](#track-upstream-branch)\n\n###Clean\n\n- [Force delete untracked files](#Force-delete-untracked-files)\n- [Force-remove untracked files and directories](#Force-delete-untracked-files-and-directories)\n- [Delete all files that are in `.gitignore`](#Delete-all-files-that-are-in-gitignore)\n- [Before deleting untracked files/directory, do a dry run to get the list of these files/directories](#before-deleting-untracked-filesdirectory-do-a-dry-run-to-get-the-list-of -these-filesdirectories)\n- [Dry run (any command that supports dry-run flag should do)](#dry-run-any-command-that-supports-dry-run-flag-should-do)\n\n###Commit\n\n- [Change last commit message](#Change-last-commit-message)\n- [Change previous commit without changing commit message](#Change-previous-commit-without-changing-message-to-commit)\n- [Fix last-commit-author-name](#Fix-name-of-last-commit-author)\n- [Reset author, after author has been changed in the global config](#reset-author-after-author-has-been-changed-in-the-global-config)\n- [Commit with only the specified files](#Commit with the specified files only)\n- [Commit bypassing pre-commit and commit-msg hooks](#Make-commit-bypass-pre-commit-and-commit-msg hooks)\n- [Mark commit as hotfix to specified commit](#Mark commit as hotfix to specified commit)\n\n###Config\n\n- [Show config and all aliases (alias)](#Show-config-and-all-alias-alias)\n- [Change local/global git config](#Change-localglobal-git-config)\n- [Change text editor](#Change-text-editor)\n- [Ignore changes to file permissions on commit](#Ignore-changes-to-file-permissions-on-commit)\n- [Make git case-sensitive](#Make-git-case-sensitive)\n- [Enable auto-correct typos](#Enable-auto-correct-typos)\n- [Disable colorColored Git output](#Disable-colored-git-output)\n- [Specific color settings](#specific-color-settings)\n- [Remove entry from global config](#Remove entry from global config)\n- [Reuse recorded resolution, record and reuse previous conflicts 
resolutions](#reuse-recorded-resolution-record-and-reuse-previous-conflicts-resolutions)\n- [Always perform a move instead of a merge when getting changes from a remote repository]\n- [Git command aliases](#Git command aliases)\n\n### Diff\n\n- [Show changes since last commit](#Show-changes-since-last-commit)\n- [Show all changes (for files not in the index and already there)](#Show-all-changes-for-files-not-in-index-and-already-there)\n- [Changes in files that are-in-index](#Changes-in-files-that-are-in-index)\n- [Show changes in one line](#Show-changes-in-one-line)\n- [Show list of conflicting files](#Show-list-of-conflict-files)\n- [Open all conflicting files in editor](#Open-all-conflicting-files-in-editor)\n- [List of all files that were changed in the commit](#List of all files that were changed in the commit)\n\n###Index\n\n- [Interactively adding files to the index](#Interactively adding files to the index)\n- [Add file part to index](#Add file part to index)\n- [Remove file from index](#Remove-file-from-index)\n- [Remove all files from index](#Remove-all-files-from-index)\n\n### Log\n\n- [Show logs for a specific period (from-to)](#Show-logs-for-a-specified-period-from-to)\n- [Show commits for the specified time](#Show-commits-for-the-specified-time-period)\n- [Show commit history grouped by author name](#Show commit history grouped by author name)\n- [Show commit history, excluding the commits of the specified author](#Show-history-of-commits-commits-specified-author)\n- [Show commits and changes in them for a specific file (even if it has been renamed)](#Show-commits-and-changes-in-them-for-a-specific-file-even-if-it-was-renamed)\n- [List only the root and merge commits](#list-only-the-root-and-merge-commits)\n- [Show non-pushed commits](#Show-non-pushed-commits)\n- [Show all commits since branch-master](#Show-all-commits-since-branch-from-master)\n- [Commits in branch-1 that are not in branch-2](#Commits-in-branch-1-that-are-not-in-branch-2)\n- [Show GPG signature in commit history](#Show gpg signature in commit history)\n- [Show number of rows added/deleted by user](#Show-number-of-rows-user-added/deleted)\n- [Search Commit History by Regular Expression](#Search Commit History by Regular Expression)\n- [Show all git notes](#Show-all-git-notes)\n- [Show tags (versions) tree](#Show-tags-versions-tree)\n- [Get first commit in a branch (from master)](#get-first-commit-in-a-branch-from-master)\n\n### Merge\n\n- [Merge feature branch into master, merging all commits of feature branch into one](#Merge-feature-branch-to-master-merging-all-commits-of-feature-branch into one)\n\n###Push\n\n- [Submit commits to remote repository by overwriting history (force push)](#Send-commits-to-remote-repository-overwriting-history-force-push)\n- [Submit commits to a remote repository, making sure you're not overwriting other people's commits]\n- [Auto-set-remote-for-branch-on-push](#Auto-set-remote-for-branch-on-push)\n\n###Show\n\n- [Show changes in commit](#Show-changes-in-commit)\n- [Show changes in commit (by hash)](#Show-changes-in-commit-by-hash)\n\n### Stash\n\n- [Hide current changes for watched files](#Hide-current-changes-for-watched-files)\n- [HiddenHide current changes, including untracked files](#Hide-current-changes-including-untracked-files)\n- [Hide current changes except for files in index](#Hide-current-changes-except-for-files-in-index)\n- [Hide only part of file(s)](#Hide-only-part-of-file-files)\n- [Show list of hidden changes](#Show-list-of-hidden-changes)\n- [Apply latest 
stashed changes and remove them from stack](#Apply-recent-hidden-changes-and-remove-them-from-stack)\n- [Apply latest stashed changes without removing them from stack](#Apply-recent-hidden-changes-without-removing-them-from-stack)\n- [Extract single file from stash](#Extract single file from stash)\n- [Clear stash](#Clear-stash)\n\n### Tags\n\n- [Create new tag](#Create-new-tag)\n- [Send tags to remote repository](#Send tags to remote repository)\n- [Remove tag in local repository](#Remove tag in local repository)\n- [Remove Tag in Remote Repository](#Remove Tag in Remote Repository)\n\n### Miscellaneous\n\n- [Everyday Git in twenty commands or so](#everyday-git-in-twenty-commands-or-so)\n- [Show helpful guides that come with Git](#show-helpful-guides-that-come-with-git)\n- [Clone separate branch](#Clone-separate-branch)\n- [Clone repository with specified number of commits](#Clone-repository-with-specified-number-of-commits)\n- [Import package to repository](#Import-package-to-repository)\n- [Alias: git undo](#alias-git-undo)\n- [Get data from remote repository and reset current branch to it](#Get data from remote repository and reset current branch to it)\n- [Prunes references to remote branches that have been deleted in the remote](#prunes-references-to-remote-branches-that-have-been-deleted-in-the-remote)\n- [Upload pull request to current branch by ID](#Upload pull request to current branch by id)\n- [Specific fetch reference](#specific-fetch-reference)\n- [List of all files till a commit](#list-of-all-files-till-a-commit)\n- [Git reset first commit](#git-reset-first-commit)\n- [Show most recent tag on current branch](#Show most recent tag on current branch)\n- [Revert: Revert a commit with a new commit](#revert-revert-a-commit-with-a-new-commit)\n- [Revert: revert merge with new commit](#revert-revert-merge-with-new-commit)\n- [Reset: Revert commits (reset to specified commit)](#reset-Revert-commits-reset-to-specified-commit)\n- [Show commit history for current branch only](#Show-history-of-commits-only-for-current-branch)\n- [Show list of remote repositories](#Show-list-of-remote-repositories)\n- [Change remote repository URL](#Change-url-remote-repository)\n- [List references in a remote repository](#list-references-in-a-remote-repository)\n- [Add remote repository](#Add-remote-repository)\n- [Autocomplete git commands in bash](#Autocomplete git commands in bash)\n- [Push commits from one branch to another using cherry-pick](#Push-commits-from-one-branch-to-another-with-cherry-pick)\n- [Undo local changes with the last content in head](#undo-local-changes-with-the-last-content-in-head)\n- [Show all tracked files](#Show-all-tracked-files)\n- [Show all untracked files](#Show all untracked files)\n- [Show all ignored files](#Show all ignored files)\n- [Create new working tree from a repository (git 2.5)](#create-new-working-tree-from-a-repository-git-25)\n- [Create new working tree from HEAD state](#create-new-working-tree-from-head-state)\n- [Don't track file (no delete)](#Don't-track-file-without-delete)\n- [Update all submodules](#Update-all-submodules)\n- [Show current-branch commits that will be merged to master](#Show-current-branch-commits-to-be-merged-to-master)\n- [Retrieve the commit hash of the initial revision](#retrieve-the-commit-hash-of-the-initial-revision)\n- [Deploying git tracked subfolder to gh-pages](#deploying-git-tracked-subfolder-to-gh-pages)\n- [Adding a project to repo using subtree](#adding-a-project-to-repo-using-subtree)\n- [Get latest changes in your repo 
for a linked project using subtree](#get-latest-changes-in-your-repo-for-a-linked-project-using-subtree)\n- [Export branch to file (create package)](#Export-branch-to-file-create-package)\n- [Archive master branch](#Archive-master-branch)\n- [Ignore one file on commit (e.g. Changelog)](#ignore-one-file-on-commit-eg-changelog)\n- [Hide Changes Before Moving](#Hide Changes Before Moving)\n- [Show changes using common diff tools](#show-changes-using-common-diff-tools)\n- [Don't consider changes for tracked file](#dont-consider-changes-for-tracked-file)\n- [Undo assume-unchanged](#undo-assume-unchanged)\n- [Restore deleted file](#Restore-deleted-file)\n- [Restore file to a specific commit-hash](#restore-file-to-a-specific-commit-hash)\n- [Check if the change was a part of a release](#check-if-the-change-was-a-part-of-a-release)\n- [Squash fixup commits normal commits](#squash-fixup-commits-normal-commits)\n- [Show list of ignored files](#Show-list-of-ignored-files)\n- [Ignored-files status](#Ignored-files-status)\n- [Count unpacked number of objects and their disk consumption](#count-unpacked-number-of-objects-and-their-disk-consumption)\n- [Prune all unreachable objects from the object database](#prune-all-unreachable-objects-from-the-object-database)\n- [Instantly browse your working repository in gitweb](#instantly-browse-your-working-repository-in-gitweb)\n- [Get file from other branch](#Get file from other branch)\n- [Modify commits interactively](#Modify-commits-interactively)\n- [Finding a commit with a bug using binary search](#Finding-a-commit-with-a-bug-using-binary-search)\n- [Show all local branches sorted by date modified](#Show all local branches sorted by date modified)\n- [Find lines matching the pattern (regex or string) in tracked files](#find-lines-matching-the-pattern-regex-or-string-in-tracked-files)\n- [Number of commits in branch](#Number-of-commits-in-branch)\n- [Add note](#Add note)\n- [Apply commit from another repository](#apply-commit-from-another-repository)\n- [Find the common ancestor of two branches](#Find-the-common-ancestor-of-two-branches)\n- [Shows last-modified-author, time, and commit-hash for each file-line]\n- [Shows last-modified-author, time, and hash commit for the specified range of rows]\n- [Show a Git logical variable](#show-a-git-logical-variable)\n- [Preformatted patch file](#preformatted-patch-file)\n- [Show repository name](#Show repository name)\n- [Generates a summary of pending changes](#generates-a-summary-of-pending-changes)\n- [Backup Untracked Files](#Backup Untracked Files)\n\n## Branch\n\n### Create a new branch and switch to it\n```sh\ngit checkout -b <branch-name>\n```\n\n__Alternatives:__\n```sh\ngit branch <branch-name> && git checkout <branch-name>\n```\n\n### Create a new branch without a parent branch\n```sh\ngit checkout --orphan <branch-name>\n```\n\n### Fast switching to the previous branch\n```sh\ngit checkout -\n```\n\n### List local and remote branches\n```sh\ngit branch -a\n```\n\n### List of branches in the remote repository\n```sh\ngit branch -r\n```\n\n### Show all branches (including remote branches), as well as the latest commit in them\n```sh\ngit branch -vv\n```\n\n### Rename branch\n```sh\ngit branch -m <new-branch-name>\n```\n\n__Alternatives:__\n```sh\ngit branch -m [<old-branch-name>] <new-branch-name>\n```\n\n### Delete local branch\n```sh\ngit branch -d <local-branch-name>\n```\n\n### Delete branch in remote repository\n```sh\ngit push origin --delete <remote-branch-name>\n```\n\n__Alternatives:__\n```sh\ngit push origin :<remote-branch-name>\n```\n\n### Show current branch name\n```sh\ngit rev-parse --abbrev-ref HEAD\n```\n\n### Show all branches not 
merged into master\n```sh\ngit checkout master && git branch --no-merged\n```\n\n### Show a list of branches that are already merged into master\n```sh\ngit branch --merged master\n```\n\n### Move the feature branch to master and merge it into master\n```sh\ngit rebase master feature && git checkout master && git merge -\n```\n\n### Delete branches that are already merged into master\n```sh\ngit branch --merged master | grep -v '^\\*' | xargs -n 1 git branch -d\n```\n\n__Alternatives:__\n```sh\ngit branch --merged master | grep -v '^\\*\\| master' | xargs -n 1 git branch -d # will not delete master if master is not checked out\n```\n\n### Find branches that contain a commit with the given hash\n```sh\ngit branch -a --contains \n```\n\n__Alternatives:__\n```sh\ngit branch --contains \n```\n\n### Track upstream branch\n```sh\ngit branch -u origin/mybranch\n```\n\n##Clean\n\n### Force deletion of untracked files\n```sh\ngit clean -f\n```\n\n### Force removal of untracked files and directories\n```sh\ngit clean -f -d\n```\n\n__Alternatives:__\n```sh\ngit clean -df\n```\n\n### Delete all files that are in `.gitignore`\n```sh\ngit clean -X -f\n```\n\n### Before deleting untracked files/directory, do a dry run to get the list of these files/directories\n```sh\ngit clean -n\n```\n\n### Dry run (any command that supports dry-run flag should do)\n```sh\ngit clean -fd --dry-run\n```\n\n## commit\n\n### Change last commit message\nWhen the command is executed, the editor specified in the git settings will open. You need to change the text of the message, save the file and close the editor.\n\nThe message can also be specified directly when invoking the command using the `-m` (`--message`)\n```sh\ngit commit --amend\n\n# you can specify a message with the -m option\ngit commit --amend -m \"New message\"\n```\n\n### Change the previous commit without changing the commit message\n```sh\ngit commit --amend --no-edit\n```\n\n### Fix last commit author name\n```sh\ngit commit --amend --no-edit --author='Author Name '\n```\n\n### Reset author, after author has been changed in the global config\n```sh\ngit commit --amend --reset-author --no-edit\n```\n\n### Make a commit with only the specified files\n```sh\ngit commit --only \n```\n\n### Make a commit by bypassing the pre-commit and commit-msg hooks\n```sh\ngit commit --no-verify\n```\n\n__Alternatives:__\n```sh\ngit commit -n\n```\n\n### Mark a commit as a fix to the specified commit\n```sh\ngit commit --fixup \n```\n\n##Config\n\n### Show config and all aliases\n```sh\ngit config --list\n```\n\n### Change local/global git config\n```sh\ngit config [--global] --edit\n```\n\n### Change text editor\n```sh\ngit config --global core.editor '$EDITOR'\n```\n\n### Ignore file permission changes on commit\n```sh\ngit config core.fileMode false\n```\n\n### Make git case sensitive\n```sh\ngit config --global core.ignorecase false\n```\n\n### Enable automatic typo correction\n```sh\ngit config --global help.autocorrect 1\n```\n\n### Disable colored Git output\n```sh\ngit config --global color.ui false\n```\n\n### Specific color settings\n```sh\ngit config --global \n```\n\n### Remove entry from global config\n```sh\ngit config --global --unset \n```\n\n### Reuse recorded resolution, record and reuse previous conflicts resolutions\n```sh\ngit config --global rerere.enabled 1\n```\n\n### Always do a move instead of a merge when getting changes from a remote repository\n```sh\ngit config --global pull.rebase true\n```\n\n__Alternatives:__\n```sh\n#git < 1.7.9\ngit config 
--global branch.autosetuprebase always\n```\n\n### Aliases for Git commands\n```sh\ngit config --global alias.<handle> <command>\ngit config --global alias.st status\n```\n\n## Diff\n\n### Show changes since last commit\n```sh\ngit diff\n```\n\n### Show all changes (for files that are not in the index and are already there)\n```sh\ngit diff HEAD\n```\n\n### Changes to files that are in the index\n```sh\ngit diff --cached\n```\n\n__Alternatives:__\n```sh\ngit diff --staged\n```\n\n### Show changes in one line\n```sh\ngit diff --word-diff\n```\n\n### Show list of conflicting files\n```sh\ngit diff --name-only --diff-filter=U\n```\n\n### Open all conflicting files in an editor\n```sh\ngit diff --name-only | uniq | xargs $EDITOR\n```\n\n### List all files that were changed in the commit\n```sh\ngit diff-tree --no-commit-id --name-only -r <commit-ish>\n```\n\n## Index\n\n### Interactively adding files to the index\n```sh\ngit add -i\n```\n\n### Add part of file to index\n```sh\ngit add -p\n```\n\n### Remove file from index\n```sh\ngit reset HEAD <file_name>\n```\n\n### Remove all files from the index\n```sh\ngit reset HEAD\n```\n\n## Log\n\n### Show logs for a specific period (from-to)\n```sh\ngit log --since='FEB 1 2017' --until='FEB 14 2017'\n```\n\n### Show commits for the specified time period\n```sh\ngit log --no-merges --raw --since='2 weeks ago'\n```\n\n__Alternatives:__\n```sh\ngit whatchanged --since='2 weeks ago'\n```\n\n### Show commit history grouped by author name\n```sh\ngit shortlog\n```\n\n### Show commit history, excluding commits by the specified author\n```sh\ngit log --perl-regexp --author='^((?!excluded-author-regex).*)$'\n```\n\n### Show commits and changes in them for a specific file (even if it has been renamed)\n```sh\ngit log --follow -p -- <file_path>\n```\n\n### List only the root and merge commits\n```sh\ngit log --first-parent\n```\n\n### Show non-pushed commits\n```sh\ngit log --branches --not --remotes\n```\n\n__Alternatives:__\n```sh\ngit log @{u}..\n```\n\n```sh\ngit cherry -v\n```\n\n### Show all commits since branching from master\n```sh\ngit log --no-merges --stat --reverse master..\n```\n\n### Commits in branch-1 that don't exist in branch-2\n```sh\ngit log branch-1 ^branch-2\n```\n\n### Show GPG signature in commit history\n```sh\ngit log --show-signature\n```\n\n### Show number of lines added/deleted by user\n```sh\ngit log --author='Your Name Here' --pretty=tformat: --numstat | gawk '{ add += $1; subs += $2; loc += $1 - $2 } END { printf \"added lines: %s removed lines: %s total lines: %s\\n\", add, subs, loc }' -\n```\n\n__Alternatives:__\n```sh\ngit log --author='Your Name Here' --pretty=tformat: --numstat | awk '{ add += $1; subs += $2; loc += $1 - $2 } END { printf \"added lines: %s, removed lines: %s, total lines: %s\\n\", add, subs, loc }' - # on Mac OSX\n```\n\n### Search in commit history by regular expression\n```sh\ngit log --all --grep='<given-text>'\n```\n\n### Show all notes (git notes)\n```sh\ngit log --show-notes='*'\n```\n\n### Show tag tree (versions)\n```sh\ngit log --pretty=oneline --graph --decorate --all\n```\n\n__Alternatives:__\n```sh\ngitk --all\n```\n\n### Get first commit in a branch (from master)\n```sh\ngit log master.. 
--oneline | tail-1\n```\n\n## Merge\n\n### Merge the feature branch into master, merging all feature branch commits into one\nThis will not create a merge commit, you will need to make it manually.\n```sh\ngit merge feature --squash\n```\n\n##Push\n\n### Push commits to remote repository by overwriting history (force push)\n```sh\ngit push --force\n```\n\n__Alternatives:__\n```sh\ngit push -f\n```\n\n### Push commits to the remote repository, making sure you don't overwrite other people's commits\n```sh\ngit push --force-with-lease \n```\n\n### Automatically set remote for branch when pushing\n\n###\n```sh\ngit config --global push.autoSetupRemote true\n```\n\n## show\n\n### Show commit changes\nYou can also use `HEAD~1`, `HEAD~2` etc. to view previous commits.\n```sh\ngit show HEAD\n```\n\n### Show commit changes (by hash)\n```sh\ngit show \n```\n\n## Stash\n\n### Hide current changes for monitored files\n```sh\ngit stash\n```\n\n__Alternatives:__\n```sh\ngit stash save\n```\n\n### Hide current changes, including untracked files\n```sh\ngit stash -u\n```\n\n__Alternatives:__\n```sh\ngit stash --include-untracked\n```\n\n### Hide current changes except for files in the index\n```sh\ngit stash --keep-index\n```\n\n### Hide only part of the file(s)\nAllows you to select the changes you want to hide\n```sh\ngit stash -p\n```\n\n### Show list of hidden changes\n```sh\ngit stash list\n```\n\n### Apply the latest stashed changes and remove them from the stack\n```sh\ngit stash pop\n```\n\n__Alternatives:__\n```sh\ngit stash apply stash@{0} && git stash drop stash@{0}\n```\n\n### Apply the latest hidden changes without removing them from the stack\n```sh\ngit stash apply \n```\n\n### Extract single file from stash\n```sh\ngit checkout -- \n```\n\n__Alternatives:__\n```sh\ngit checkout stash@{0} -- \n```\n\n### Clear stash\n```sh\ngit stash clear\n```\n\n__Alternatives:__\n```sh\ngit stash drop \n```\n\n## Tags\n\n### Create a new tag\n```sh\ngit tag \n```\n\n### Send tags to remote repository\n```sh\ngit push --tags\n```\n\n### Delete tag in local repository\n```sh\ngit tag -d \n```\n\n### Delete tag in remote repository\n```sh\ngit push origin :refs/tags/\n```\n\n__Alternatives:__\n```sh\ngit push origin :\n```\n\n```sh\ngit push -d origin \n```\n\n## Miscellaneous\n\n### Everyday Git in twenty commands or so\n```sh\ngit help daily\n```\n\n### Show helpful guides that come with Git\n```sh\ngit help -g\n```\n\n### Clone a separate branch\n```sh\ngit clone -b --single-branch https://github.com/user/repo.git\n```\n\n### Clone the repository with the specified number of commits\n```sh\ngit clone https://github.com/user/repo.git --depth 1\n```\n\n### Import paket to the repository\n```sh\ngit clone repo.bundle -b \n```\n\n### Alias: git undo\n```sh\ngit config --global alias.undo '!f() { git reset --hard $(git rev-parse --abbrev-ref HEAD)@{${1-1}}; }; f'\n```\n\n### Get the data from the remote repository and reset the state of the current branch to it\n```sh\ngit fetch --all && git reset --hard origin/master\n```\n\n### Prunes references to remote branches that have been deleted in the remote\n```sh\ngit fetch -p\n```\n\n__Alternatives:__\n```sh\ngit remote prune origin\n```\n\n### Upload a pull request to the current branch by ID\n```sh\ngit fetch origin pull//head:\n```\n\n__Alternatives:__\n```sh\ngit pull origin pull//head:\n```\n\n### Specific fetch reference\n```sh\ngit fetch origin master:refs/remotes/origin/mymaster\n```\n\n### List of all files till a commit\n```sh\ngit ls-tree --name-only 
-r \n```\n\n### Git reset first commit\n```sh\ngit update-ref -d HEAD\n```\n\n### Show the most recent tag on the current branch\n```sh\ngit describe --tags --abbrev=0\n```\n\n### Revert: Revert a commit with a new commit\n```sh\ngit revert \n```\n\n### Revert: unmerge with a new commit\n```sh\ngit revert -m 1 \n```\n\n### Reset: Revert commits (reset to specified commit)\n```sh\ngit reset \n```\n\n### Show commit history for current branch only\n```sh\ngit cherry -v master\n```\n\n### Show list of remote repositories\n```sh\ngit remote\n```\n\n__Alternatives:__\n```sh\ngit remote show\n```\n\n### Change remote repository URL\n```sh\ngit remote set-url origin \n```\n\n### List references in a remote repository\n```sh\ngit ls-remote git://git.kernel.org/pub/scm/git/git.git\n```\n\n### Add remote repository\n```sh\ngit remote add \n```\n\n### Autocomplete Git commands in bash\n```sh\ncurl http://git.io/vfhol > ~/.git-completion.bash && echo '[ -f ~/.git-completion.bash ] && . ~/.git-completion.bash' >> ~/.bashrc\n```\n\n### Push commits from one branch to another using cherry-pick\n```sh\ngitcheckout && git cherry-pick \n```\n\n### Undo local changes with the last content in head\n```sh\ngit checkout -- \n```\n\n### Show all tracked files\n```sh\ngit ls-files -t\n```\n\n### Show all untracked files\n```sh\ngit ls-files --others\n```\n\n### Show all ignored files\n```sh\ngit ls-files --others -i --exclude-standard\n```\n\n### Create new working tree from a repository (git 2.5)\n```sh\ngit worktree add -b \n```\n\n### Create new working tree from HEAD state\n```sh\ngit worktree add --detach HEAD\n```\n\n### Don't track the file (no deletion)\nRemoves a file from git while keeping a local copy of it\n```sh\ngit rm --cached \n```\n\n__Alternatives:__\n```sh\ngit rm --cached -r \n```\n\n### Update all submodules\n```sh\ngit submodule foreach git pull\n```\n\n__Alternatives:__\n```sh\ngit submodule update --init --recursive\n```\n\n```sh\ngit submodule update --remote\n```\n\n### Show current branch commits that will be merged into master\n```sh\ngit cherry -v master\n```\n\n__Alternatives:__\n```sh\ngit cherry -v master \n```\n\n### Retrieve the commit hash of the initial revision\n```sh\n git rev-list --reverse HEAD | head-1\n```\n\n__Alternatives:__\n```sh\ngit rev-list --max-parents=0 HEAD\n```\n\n```sh\ngit log --pretty=oneline | tail -1 | cut -c 1-40\n```\n\n```sh\ngit log --pretty=oneline --reverse | head -1 | cut -c 1-40\n```\n\n### Deploying git tracked subfolder to gh-pages\n```sh\ngit subtree push --prefix subfolder_name origin gh-pages\n```\n\n### Adding a project to repo using subtree\n```sh\ngit subtree add --prefix=/ --squash git@github.com:/.git master\n```\n\n### Get latest changes in your repo for a linked project using subtree\n```sh\ngit subtree pull --prefix=/ --squash git@github.com:/.git master\n```\n\n### Export branch to file (create package)\n```sh\ngit bundle create \n```\n\n### Archive branch\u0443 master\n```sh\ngit archive master --format=zip --output=master.zip\n```\n\n### Ignore one file on commit (e.g. 
Changelog)\n```sh\ngit update-index --assume-unchanged Changelog; git commit -a; git update-index --no-assume-unchanged Changelog\n```\n\n### Hide changes before rebasing\n```sh\ngit rebase --autostash\n```\n\n### Show changes using common diff tools\n```sh\ngit difftool -t <tool>\n```\n\n### Don\u2019t consider changes for tracked file\n```sh\ngit update-index --assume-unchanged <file_name>\n```\n\n### Undo assume-unchanged\n```sh\ngit update-index --no-assume-unchanged <file_name>\n```\n\n### Restore deleted file\n```sh\ngit checkout <deleting_commit>^ -- <file_path>\n```\n\n### Restore file to a specific commit-hash\n```sh\ngit checkout <commit-ish> -- <file_path>\n```\n\n### Check if the change was a part of a release\n```sh\ngit name-rev --name-only <SHA-1>\n```\n\n### Squash fixup commits into normal commits\n```sh\ngit rebase -i --autosquash\n```\n\n### Show list of ignored files\n```sh\ngit check-ignore *\n```\n\n### Status of ignored files\n```sh\ngit status --ignored\n```\n\n### Count unpacked number of objects and their disk consumption\n```sh\ngit count-objects --human-readable\n```\n\n### Prune all unreachable objects from the object database\n```sh\ngit gc --prune=now --aggressive\n```\n\n### Instantly browse your working repository in gitweb\n```sh\ngit instaweb [--local] [--httpd=<httpd>] [--port=<port>] [--browser=<browser>]\n```\n\n### Get file from other branch\n```sh\ngit show <branch_name>:<file_name>\n```\n\n### Modify commits interactively\n```sh\ngit rebase --interactive HEAD~2\n```\n\n### Find a commit with a bug using binary search\n```sh\ngit bisect start # Search start\ngit bisect bad # Set point to bad commit\ngit bisect good v2.6.13-rc2 # Set point to good commit|tag\ngit bisect bad # Say current state is bad\ngit bisect good # Say current state is good\ngit bisect reset # Finish search\n```\n\n
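If the project has a script that can tell a good commit from a bad one, the search can also be automated with `git bisect run`. A minimal sketch, assuming a hypothetical `./test.sh` that exits with 0 on good commits and a non-zero status on bad ones (the good commit/tag is a placeholder):\n```sh\ngit bisect start             # start the search\ngit bisect bad               # the current HEAD is known to be bad\ngit bisect good v2.6.13-rc2  # last known good commit or tag\ngit bisect run ./test.sh     # let git test each step and report the first bad commit\ngit bisect reset             # finish the search and return to the original HEAD\n```\n`git bisect run` treats exit code 0 as good, 125 as \"skip this commit\", and any other code below 128 as bad.\n\n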
### Show all local branches sorted by modified date\n```sh\ngit for-each-ref --sort=-committerdate --format='%(refname:short)' refs/heads/\n```\n\n### Find lines matching the pattern (regex or string) in tracked files\n```sh\ngit grep --heading --line-number 'foo bar'\n```\n\n### Number of commits in a branch\n```sh\ngit rev-list --count <branch_name>\n```\n\n### Add note\n```sh\ngit notes add -m 'Note on the previous commit....'\n```\n\n### Apply commit from another repository\n```sh\ngit --git-dir=<source_dir>/.git format-patch -k -1 --stdout <SHA1> | git am -3 -k\n```\n\n### Find the common ancestor of two branches\n```sh\ndiff -u <(git rev-list --first-parent BranchA) <(git rev-list --first-parent BranchB) | sed -ne 's/^ //p' | head -1\n```\n\n### Show the author, time, and commit hash of the last modification for each line of the file\nYou can also run the command with the `-s` flag to suppress the author name and timestamp in the output\n```sh\ngit blame <file_name>\n```\n\n### Show the author, time, and commit hash of the last modification for the specified range of lines\n```sh\ngit blame