url (stringlengths 13-4.35k) | tag (stringclasses 1 value) | text (stringlengths 109-628k) | file_path (stringlengths 109-155) | dump (stringclasses 96 values) | file_size_in_byte (int64 112-630k) | line_count (int64 1-3.76k) |
---|---|---|---|---|---|---|
https://worldbuilding.stackexchange.com/questions/90811/would-self-fitting-clothes-be-a-great-innovation-for-a-future-civilization-or-wo | code | Self-fitting clothes sound cool but I can't help but feel that they wouldn't last long and would soon become impractical and just a passing fad. What are the benefits and drawbacks to self fitting clothes?
closed as primarily opinion-based by L.Dutch♦, Separatrix, adaliabooks, sphennings, Philipp Sep 3 '17 at 23:18
Many good questions generate some degree of opinion based on expert experience, but answers to this question will tend to be almost entirely based on opinions, rather than facts, references, or specific expertise. If this question can be reworded to fit the rules in the help center, please edit the question.
Presuming the self-fitting feature of clothing was inexpensive, actually worked, and eventually the cost of such a feature was not noticeable to consumers, they would become ubiquitous. IRL, many wealthy people who can afford tailors do use them, whether people "notice" their clothes are fitted or not.
Fitting can make a garment you like available, particularly if the only available pre-made sizes are all either too baggy in some places or too tight or stretched in another.
That means self-fitting is a valuable feature for both producers and consumers: It increases the market size for producers, and increases the range of styles for consumers.
There is no reason to think it would be any less popular than other similar innovations; like color-fastness, washability, non-shrinking, stain-resistance, and so on. Even if it were noticeably more expensive, it might attract an intermediate market between the lower middle class and those that can routinely afford more expensive custom tailoring. | s3://commoncrawl/crawl-data/CC-MAIN-2019-39/segments/1568514574532.44/warc/CC-MAIN-20190921145904-20190921171904-00187.warc.gz | CC-MAIN-2019-39 | 1,630 | 7 |
https://community.rmit.edu.au/t5/General-Discussion/Accepting-Solutions-in-the-myCommunity-Mobile-Site/m-p/8167 | code | Accepting Solutions in the myCommunity (Mobile Site)
One of the key purposes of the myCommunity is the sharing of knowledge and experiences amongst
the community at large.
If you've posed a question on the forums, it's very likely someone may provide a solution to this question
that works for you, in which case you can mark the response as a Solution.
Marking responses as a Solution is very important as it helps others who are accessing the forums to
find potential answers more readily.
If you've had a question successfully answered, please take a moment to accept it as the Solution for the
benefit of other myCommunity members (and yourself, should you ever forget).
1. First you make a post with your question:
2. Next, other members of the myCommunity will reply. If you find a reply that answers your question, tap the option to accept that reply as the solution.
3. Once a solution has been accepted it will be highlighted in green for all other members to see. The initial post in the thread will also display the "Go To Solution" link, which will take other members directly to the accepted solution. | s3://commoncrawl/crawl-data/CC-MAIN-2022-49/segments/1669446710801.42/warc/CC-MAIN-20221201053355-20221201083355-00256.warc.gz | CC-MAIN-2022-49 | 1,050 | 13 |
http://aok.heavengames.com/blacksmith/showfile.php?fileid=5544 | code | I found this SLP list created by someone named ELITE NIGHT. It was incomplete because SLP Studio could not open hundreds of graphics. So I completed this list. I didn't find it here on this site and decided to post it. If SLP list is available here then I suppose mine is a piece of junk. | s3://commoncrawl/crawl-data/CC-MAIN-2021-25/segments/1623488519183.85/warc/CC-MAIN-20210622155328-20210622185328-00605.warc.gz | CC-MAIN-2021-25 | 288 | 1 |
http://writers.stackexchange.com/questions/tagged/outline?sort=unanswered&pagesize=15 | code | tag has no wiki summary.
Using the Minto Pyramid
Using the Minto Pyramid, should I order the paragraphs of my paper by going through the rows of the pyramid like this? Thesis > Why > How > Where > A > B > C > D > E Or should I start at the top ...
Jul 2 '13 at 21:48 | s3://commoncrawl/crawl-data/CC-MAIN-2014-42/segments/1413507444829.13/warc/CC-MAIN-20141017005724-00302-ip-10-16-133-185.ec2.internal.warc.gz | CC-MAIN-2014-42 | 2,318 | 54 |
https://admhelp.microfocus.com/accurev/en/latest/online/Content/AccuSync-Manage/Copy%20Configuration%20Dialog%20Box.htm | code | Copy Configuration Dialog Box
You use the Copy Configuration dialog box to create a new configuration based on an existing configuration. Copying a configuration is often the easiest way to create a new configuration.
|New configuration name||The name you want to give the configuration you are creating.|
See Copying an AccuSync Configuration and Creating a New Configuration. | s3://commoncrawl/crawl-data/CC-MAIN-2023-14/segments/1679296943698.79/warc/CC-MAIN-20230321131205-20230321161205-00727.warc.gz | CC-MAIN-2023-14 | 390 | 4 |
https://docs.manywho.com/list-of-boomi-flow-service-urls/ | code | The Service element in Flow lets us connect our apps to other applications, service providers, or databases. You can also use services to extend the functionalities of a flow (say, to delay the flow, or schedule a task) and for end-user authentication.
Here is a list of services Flow currently supports, and the respective URLs.
Amazon Lex lets you build conversational interfaces into apps with voice and text.
Service URL: https://services.manywho.com/api/aws/lex/1
Convert text in your app to speech with Amazon Polly.
Service URL: https://services-staging.manywho.com/api/aws/polly/1
You can add intelligent image and video analysis to your Flow apps with Amazon Rekognition.
Service URL: https://services-staging.manywho.com/api/aws/rekognition/1
Amazon Simple Storage Service (S3) is an object storage service, that you can use as the backend database for your Flow app.
Service URL: https://services-staging.manywho.com/api/aws/s3/1
Box is a file sharing, storage, and collaboration service for enterprises. You can integrate Box with Boomi Flow to modify, share, or edit documents and folders stored in Box in an app. You can also use the integration to trigger an action in Boomi Flow if a document or folder is edited, downloaded, or uploaded in Box.
Service URL: https://services-staging.manywho.com/api/box/3
You can use the Flow Email service to build an app that can send emails. The email service needs to be configured with your own email provider in tenants. The service works with major email providers like Gmail or Microsoft Exchange.
Service URL: https://services.manywho.com/api/email/1
Azure is a suite of enterprise cloud offerings from the Microsoft stable. You can use the Flow Azure service to restrict your apps to specific groups, to specific users, or to specific users within a specific group.
Service URL: https://services.manywho.com/api/azure/1
You can use Okta identity and access management for authenticating your app users.
Service URL: https://services.manywho.com/api/okta/1
You can use OneLogin identity and access management for authenticating your app users.
Service URL: https://services-staging.manywho.com/api/onelogin/1
Boomi Flow lets you populate a PDF form, using a PDF service integration.
Service URL: https://services-staging.manywho.com/api/pdf/1
Service URL: https://salesforce.manywho.com/plugins/api/salesforce/1
SharePoint is a collaboration platform from the Microsoft stable. You can build apps with Boomi Flow that work seamlessly with SharePoint.
Service URL: https://services.manywho.com/api/sharepoint/1
The Boomi Flow SQL service lets you read from and write to an external database. Supported databases include MySQL 5.1+, PostgreSQL 8.4+, and SQL Server 2008+.
Service URL: https://services.manywho.com/api/sql/2
The Timer service is a Boomi Flow plug-in that lets you pause your flow and resume again, from within the flow itself. The service is particularly useful for time-based escalations of a process.
Service URL: https://services.manywho.com/api/timer/1
Service URL: https://services.manywho.com/api/twilio/2
You can use the Twitter service to send tweets from within apps you build with Flow.
Service URL: https://services-staging.manywho.com/api/twitter/1 | s3://commoncrawl/crawl-data/CC-MAIN-2021-10/segments/1614178351374.10/warc/CC-MAIN-20210225153633-20210225183633-00266.warc.gz | CC-MAIN-2021-10 | 3,233 | 32 |
http://ptc-computer.com.kh.outerstats.com/ | code | PTC Computer - The Biggest Technology Store in CambodiaWelcome to Cambodia's Largest Technology Store - PTC Computer. In our shops we have the widest selection of computer, laptops, tablets, servers, and more.
Ptc-computer.com.kh was created on an unknown date. The domain is hosted at IP 220.127.116.11, assigned to the IP range 18.104.22.168 - 22.214.171.124, whose owner is MBIT-KH Maximum Business Information Technology. Our algorithm estimates Ptc-computer.com.kh's worth to be about $7,879 and estimates that it gets about 1,972 visits per day. Ptc-computer.com.kh is located in Cambodia. Ptc-computer.com.kh uses an Apache server and is powered by unknown.
Hosted in: Cambodia
Host IP: 126.96.36.199
ICANN Registrar: unavailable
Domain Archive: ptc-computer.com.kh in the past
Alexa Rank: #507173
Google Page Rank: 0
Server DNS A: 188.8.131.52
Server DNS NS: ns2.hosting01.flash-it.biz ns1.hosting01.flash-it.biz
Server Name: unavailable
Server Type: Apache
Server Side Language: unavailable
|Header Key||Header Value|
|Date||Tue, 15 Nov 2016 00:39:23 GMT|
|Expires||Thu, 19 Nov 1981 08:52:00 GMT|
|Cache-Control||no-store, no-cache, must-revalidate|
We believe that every website owner is able to earn money from his website.
Our estimations point that your Website Worth is $7,878.55, Your Daily Visitors could be in the area of 1972 per day and your estimated Daily Revenues could be around $5.92.
Server Country Code: KH
Server Country Name: Cambodia
Server City Name: Phnom Penh
Server Region Name: 11
Server Zip Code:
Server Latitude: 11.5625
Server Longitude: 104.91600036621
tc-computer.com.kh, ltc-computer.com.kh, xtc-computer.com.kh, pic-computer.com.kh, ptd-computer.com.kh, ptk-computer.com.kh, ptl-computer.com.kh, ptc-zomputer.com.kh, ptc-cemputer.com.kh, ptc-csmputer.com.kh, ptc-cvmputer.com.kh, ptc-cwmputer.com.kh, ptc-comouter.com.kh, ptc-comtuter.com.kh, ptc-comptter.com.kh, ptc-compucer.com.kh, ptc-compuver.com.kh, ptc-computel.com.kh, ptc-computer.iom.kh, ptc-computer.uom.kh, ptc-computer.ccm.kh, ptc-computer.ctm.kh, ptc-computer.comikh, ptc-computer.com.jh, ptc-computer.com.mh, ptc-computer.com.kt, nptc-computer.com.kh, optc-computer.com.kh, ptsc-computer.com.kh, ptxc-computer.com.kh, ptcu-computer.com.kh, ptc-icomputer.com.kh, ptc-ucomputer.com.kh, ptc-cofmputer.com.kh, ptc-comlputer.com.kh, ptc-compjuter.com.kh, ptc-computier.com.kh, ptc-computerm.com.kh, ptc-computerq.com.kh, ptc-computer.vcom.kh, ptc-computer.csom.kh, ptc-computer.cotm.kh, ptc-computer.cozm.kh, ptc-computer.com.lkh, ptc-computer.com.skh, ptc-computer.com.ukh, ptc-computer.com.ksh, ptc-computer.com.khj, ptc-computer.com.khs, ptc-computer.com.kht | s3://commoncrawl/crawl-data/CC-MAIN-2016-50/segments/1480698542714.38/warc/CC-MAIN-20161202170902-00296-ip-10-31-129-80.ec2.internal.warc.gz | CC-MAIN-2016-50 | 2,653 | 27 |
https://www.fi.freelancer.com/projects/data-entry/data-organization-28277085/?ngsw-bypass=&w=f | code | No jobs found
Sorry, we could not find the job you were looking for.
Find the latest jobs here:
I need to develop a website that lets visitors book 15- or 30-minute meetings, free of charge, all organized with Google Calendar and Google Meet. The contact form must feed a Google spreadsheet and send a confirmation of receipt of the message to the sender's email using a Google Workspace account. ...
Training and certification organization
For Amazon Macie - just want my first job - I set up an S3 bucket with a DL name, SSI #, personal name and birthday, and none of this information is getting flagged? Task: create a simple example in Python to extract the data from the S3 bucket so that the findings show up in the Macie output. Will pay quickly and leave 5 stars. Please give your best possible bid? Please note there is hope w...
* Only Professional APP development Business to bid with experience of Card linked offer APP require to build website and OS and Android APP - Bidder shall be business not individual - shall have already experience with card linked offer APP - provide all requirements to build the APP - previous reference work
Looking for someone to work a 2-5 hours a month to create fresh Instagram content in line with the company ethos and grow the account @printstinctuk
Hello Developer i am looking for a developer who is already developed a Bus Ticket Booking app and web panel and experience in reservation system. not interested to develop this from start, only developer who did this project before can contact me.
I am looking for Excel Expert. If you are expert on Microsoft Excel , Word, PowerPoint, contact me. You must know English.
This is 100% Remote contract Job Need a candidate to work 8 hours a day from Monday to Friday Candidate must be in India, This a contract Job Ideal should have 3-5 years of experience working on HCM Cloud implementation projects Must have good experience in developing Fast formulas Should have developed integrations using HDL (for inbound) and BIP & HCM Extracts (for outbound) Should have de...
I have a Postgres container running with the following deployment. [login to view URL] I want to upgrade it to postgres:12. Let me know your process to do this in-place or in a partially automated way.
Hello, I am looking for native English (UK) and Spanish proofreader for proofreading around 6k words. Please bid native on this project thank you | s3://commoncrawl/crawl-data/CC-MAIN-2021-10/segments/1614178362899.14/warc/CC-MAIN-20210301182445-20210301212445-00039.warc.gz | CC-MAIN-2021-10 | 2,498 | 13 |
https://eleganthack.com/simple-answers-for-simple-minds-redux/ | code | He’s at it again: Jakob starts with an outrageous statement, follows up with some uncited statistics, throws in a bad and excessive metaphor (bake your own bricks indeed!), moves to a left-handed pitch of his research product, and then shows that he doesn’t get out much (heard of Bigstep, J?) and finishes with a conclusion built on a whole lotta nothing. Perhaps the increased publishing schedule is getting to him, but this column needs to go back to the drawing board. Or if Mr. Nielsen wants to throw out his theories half-baked, he should get a blog.
Compare it to this small gem where Mr. Nielsen puts his finger on a key problem… not sexy, but needed.
Is there no middle ground? | s3://commoncrawl/crawl-data/CC-MAIN-2023-50/segments/1700679100535.26/warc/CC-MAIN-20231204214708-20231205004708-00344.warc.gz | CC-MAIN-2023-50 | 692 | 3 |
https://knowledgebase.cobweb.com/help/csptransfer | code | If you have an existing Office 365 service that you want to bring over to Cobweb, you will need to add us as a Partner to your tenant.
Accepting this agreement does not change your existing subscriptions with Microsoft or other Office 365 Partners.
You must accept this agreement whilst logged in as a Global Administrator of your Office 365 service.
Accept the Partner Relationship
Once accepted, you will be shown your list of partners, which will confirm Cobweb Solutions Ltd has been added.
This relationship provides us with administrative access over your tenant, equivalent to a Global Administrator. This can be removed by clicking into the Partner entry and selecting Remove delegated permissions. However, removing our delegated permissions will disable some features of Cobweb CORE to be able to manage your account and may prevent us from being able to provide support for your service.
NOTE: If you are only shown the Business Store homepage after opening our relationship agreement link, the account you are signed in with does not have permissions to accept the agreement. Please sign-out and log in as a Global Administrator account before attempting to accept the agreement again. | s3://commoncrawl/crawl-data/CC-MAIN-2022-21/segments/1652662539131.21/warc/CC-MAIN-20220521143241-20220521173241-00684.warc.gz | CC-MAIN-2022-21 | 1,197 | 7 |
https://blender.stackexchange.com/questions/253226/is-there-a-way-to-get-height-info-of-any-geometry-in-geometry-node-editor | code | Is there a way to get height info of any geometry in geometry node editor?
For example, here I am using a 'Object Info' node to get a geometry. I want to get the height of that object.
Yes. Probably the two most reasonable options are:
Below I use both techniques to control cube dimensions based on the targeted object's height:
I apologize for the necropost, but I modified Markus von Broady's bounding box solution because for some reason scale didn't apply to the dimensions when I tried this. My nodes for the bounding box method will get the X, Y, and Z scales of the geometry. The separate XYZ node at the end isn't required, it'll work as just the vector; I just put it there because it suited my needs for one of my projects. | s3://commoncrawl/crawl-data/CC-MAIN-2024-10/segments/1707947474660.32/warc/CC-MAIN-20240226130305-20240226160305-00143.warc.gz | CC-MAIN-2024-10 | 730 | 5 |
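As an aside on the question above: the same height can also be read outside the node editor with Blender's Python API. This is a hedged sketch only (it assumes an object named "Cube" exists and must be run inside Blender), not part of the answers quoted in this row.

```python
import bpy  # only available inside Blender's bundled Python

obj = bpy.data.objects["Cube"]  # assumed object name

# dimensions is the bounding-box size with the object's scale applied,
# which is the "height" the bounding-box node trick is after.
height_with_scale = obj.dimensions.z

# bound_box holds the 8 local-space corners, *without* object scale,
# which illustrates the scale discrepancy mentioned in the last answer.
zs = [corner[2] for corner in obj.bound_box]
height_without_scale = max(zs) - min(zs)

print(height_with_scale, height_without_scale)
```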
http://www.actor-atlas.info/lgu:syrian-arab-republic | code | The #WWlgu tag of Syrian Arab Republic - #SYlgu in Asia is #SYlgu.
Check #WWlgu hashtags for a brief explanation on this kind of hashtag.
Check Your community leading sustainable development? #WWlgu for #LocalizingSDGs for the #WWlgu hashtags of local government units in other countries.
|muhafazah (محافظة)||city (capital of muhafazah (governorate))||#SYlgu||Development data ( Knoema )|
|الحسكة||Al Hasakah (al Hasakah)||#SY01||Data atlas|
|اللاذقية||Al Ladhiqiyah (al Ladhiqiyah)||#SY02||Data atlas|
|القنيطرة||Al Qunaytirah (al Qunaytirah)||#SY03||Data atlas|
|الرقة||Ar Raqqah (ar Raqqah)||#SY04||Data atlas|
|السويداء||As Suwayda' (as Suwayda')||#SY05||Data atlas|
|درعا||Dar`a (Dar`a)||#SY06||Data atlas|
|دير الزور||Dayr az Zawr (Dayr az Zawr)||#SY07||Data atlas|
|مدينة دمشق||Dimashq||#SY08||Data atlas|
|حلب||Halab (Halab)||#SY09||Data atlas|
|حماة||Hamah (Hamah)||#SY10||Data atlas|
|حمص||Hims (Hims)||#SY11||Data atlas|
|ادلب||Idlib (Idlib)||#SY12||Data atlas|
|ريف دمشق||Dimashq (Rif Dimashq)||#SY13||Data atlas|
|طرطوس||Tartus (Tartus)||#SY14||Data atlas|
|national #tags||this page|
|#SYlgu tags||combine the ISO country code SY with the numerical part of the ADM1 Code for the muhafazah (governorate)|
|construct more tags with the extended divisioning geocodes (Geohive)|
Note: As there is no domestically defined geocode for each muhafazah (governorate), we use the numerical part of the ADM1 code1. Obtain that code from the ADM1 Code Search module by entering Syrian Arab Republic's GEC country code: SY2.
- For cities' tags, check Syrian Arab Republic (Geohive).
Sustainable development in Syrian Arab Republic - #SYlgu?
Curated open data for Syrian Arab Republic - #SYlgu
In the media
With a hashtag per city or local government unit, anyone can tag content about the city or local government unit, and share it via social media as explained at #tags in support of easy information retrieval (video on YouTube).
If you publish a website or blog, you can also embed the Twitter timeline of a hashtag as explained at How to embed a #tag timeline in your website or blog? (Video on YouTube). There is a Twitter timeline embedded at #udhr19 - Freedom of opinion & expression; to seek & receive information.
About Syrian Arab Republic
Syrian Arab Republic: A Country Study (2001, ed. by Peter R. Blood, on-line version) published by the Federal Research Division of the US Library of Congress. The study offers a comprehensive description and analysis of the country's historical setting, geography, society, economy, political system, and foreign policy. | s3://commoncrawl/crawl-data/CC-MAIN-2017-17/segments/1492917121865.67/warc/CC-MAIN-20170423031201-00621-ip-10-145-167-34.ec2.internal.warc.gz | CC-MAIN-2017-17 | 2,661 | 30 |
http://www.gopherforum.com/showthread.php?t=12437 | code | Just wanted to give Steve a testimonial on his Gopher software. This software has everything you need to keep things in order. I found it great because I can make jobs for any number of days, not just weekly or bi-weekly. It tracks expenses and other neat things. Using the other software I had, I was lost and unorganized, but with this software it is a win-win. Anyone having any second thoughts, try the trial and you will be buying this.
Things I might suggest adding to make it better: if it could be made as a web app, I could get to it on the web and not just from the computer it's on. Like integrating it into one's main site so you can add customers and do other things, but if I'm on the road I can just log in and keep track of things. Also, it makes backing things up easier doing it this way. Just my two cents. | s3://commoncrawl/crawl-data/CC-MAIN-2016-44/segments/1476988720000.45/warc/CC-MAIN-20161020183840-00193-ip-10-171-6-4.ec2.internal.warc.gz | CC-MAIN-2016-44 | 812 | 2 |
https://supaflyart.me/blog/category/Illustration | code | My life is a lot of work lately, so I have been slacking putting together this site, which feels like a waste of money, but being busy isn't necessarily a bad thing.
I've been working my Normal Job For Square (TM) and working on two projects that I am being paid to do, so that's cool. Brandon of Waterlogged visited Philly a couple months ago and we had a chance to hang out for literally the first time ever, and it was a lot of fun. Other than that it's been going from the NJFS, to my home where I work on the projects, to my bed where I sleep, and then back to NJFS, from where the cycle then repeats over and over and over. So, the holidays have been a nice break from the constant grind.
I can't go into detail about the projects or when they're being released, but I can tell you that they have been keeping me very busy. They're a book and a comic, and work on the comic project has been going steadily, so I'm hoping to have the first part (of 6) finished before the end of the year, and the book project should be done in December.
Here's hoping to get the site to a place where I feel more comfortable giving people the URL, but in the meantime at least I am working, so I guess the site hasn't quite been a priority. I'm also trying to get some fun personal projects done as well so I don't go crazy, so I'm posting up a "doodle" section onto the Illustration page, and when I can get some cool doodles done more frequently I'll do blog posts of them. Also, once I feel less overwhelmed by everything, I'll try and use this blog section more frequently as well.
Here's to a busy next several months! | s3://commoncrawl/crawl-data/CC-MAIN-2019-39/segments/1568514573759.32/warc/CC-MAIN-20190919224954-20190920010954-00528.warc.gz | CC-MAIN-2019-39 | 1,612 | 5 |
https://github.com/samuel/satel | code | This repository has been archived by the owner. It is now read-only.
I wrote 'satel' the IRC bot around 1998 for the channel #!/bin/csh on EFnet. Recently, I found the code lying around and decided to release it for personal/historical reasons. P.S. There is nothing good about this code. | s3://commoncrawl/crawl-data/CC-MAIN-2017-47/segments/1510934806660.82/warc/CC-MAIN-20171122194844-20171122214844-00676.warc.gz | CC-MAIN-2017-47 | 512 | 6 |
https://forum.disroot.org/t/epub-js-based-ebook-reader/5840 | code | I would like an epub reader to use my account more optimally.
There used to be one on nextcloud called “reader” but that appears to be un-maintained and does not work beyond NC v12.
But recently someone created: https://apps.nextcloud.com/apps/files_ebookreader
Available from the base of https://github.com/futurepress/epub.js/
Could disroot offer this module? Licensing appears to be OK. It would be like offering the pdf-viewer for epubs. | s3://commoncrawl/crawl-data/CC-MAIN-2023-06/segments/1674764499816.79/warc/CC-MAIN-20230130101912-20230130131912-00594.warc.gz | CC-MAIN-2023-06 | 445 | 5 |
http://der-lack-docktor.de/bannerli/freebooks/download-the-spherical-deformation-model-2001.php | code | 32; 1 download Das Problem time;( 0 person you! 32; 1 DOWNLOAD THE CAMBRIDGE COMPANION TO type;( 0 store you! 32; 1 SIMILAR WEB-SITE handbook;( 0 characteristics, mm, and showing? 39; buying the MIT instrumentation to CS with Python, and I went no legend. 32; 1 download tropes, universals and the philosophy of mind: essays at the boundary of ontology and philosophical inch;( 0 calibrations! 32; 1 DOWNLOAD TRAVELS IN A GAY NATION: PORTRAITS OF LGBTQ AMERICANS 2010 set;( 0 blog you! 32; 1 download How the Mind Works 1999 marketing;( 0 blog you apiece then. The over here has to inveigle paper-based. 32; 1 download American Liberalism: An Interpretation for Our Time (H. Eugene and Lillian Youngs Lehman Series) 2007 mortal;( 0 addiction you back generally!When they 'm, why are they officially intersect to rule download the spherical deformation model 2001 systems large as categorization, Python, and object? And why are some good reloads appear? beverage goal and thread blockchain look n't repeal in a checkout. Erst, they ask target in low online and simple programs, in which interactions and tactics with free pattern transfer within exercising models as they want 2017Howdy Conditions. The download of these requirements is what this Report patients funding, and the access in which these developers suffer loss, the stock poetry. The instrument of kidneys to prey and their care to become and have to Expend then many levels have what shape for hull. Also, who is, who needs prevented, and what views like proportionality to the income & be the objective and place of symptoms and, much, their " on separation media. | s3://commoncrawl/crawl-data/CC-MAIN-2020-50/segments/1606141737946.86/warc/CC-MAIN-20201204131750-20201204161750-00613.warc.gz | CC-MAIN-2020-50 | 1,630 | 1 |
https://lists.opensuse.org/archives/list/[email protected]/thread/K6O73JZKOQAFS4QLJGEVYU7EE74HZIYC/ | code | We had a meeting last night. Below are the points from the meeting. I'll be approving the talks soon and sending a drop box (nextcloud) link for people to upload their videos. We will have another meeting for the video team on Friday.
#info The CfP ends today May 4
#info Registration 105, Submissions 47, 28 hours, 3 withdrawals
#topic Recording and/or Live Stream Team
#info Had video team meeting on April 30 at 18:00 UTC in meet.opensuse.org/bar
#info Next meeting is May 7 at 18:00 UTC in meet.opensuse.org/bar
#info Datto and AlmaLinux would like to sponsor
#action ddemaio to explore options with AlmaLinux and finalize Datto sponsorship
#info Link is ready and will send out to registered participants to get their t-shirt
#info Hold off on sending the link until May 11
#info Company sending t-shirts use UPS, which has some delivery difficulties for some countries.
#info Only one t-shirt per attendee
#info Keynote speaker 1 Sheng Liang, SUSE E&I
#info Keynote speaker 2 Alex Lee, CEO of shells.com
#info Collaboration panel with speakers from different projects (NG, MM, BC, OK, AS, DC)
Waiting on AW to reply
Recording can be done in OBS or other tools. If someone is good with OBS (broadcast) and wants to make a quick tutorial for the community on how to use it, please reply.
Info on where to send the recordings will be sent out to speakers ASAP
Recording should be sent early and quality should be good
If you have never done a recording with OBS or other tool, send a sample to [email protected] to receive feedback.
Pointers for people doing their video:
Make sure that the audio and ambient conditions are the same
Provide defaults on how to record a video with OBS. Have them send their recording to get feedback.
Assemble best practices videos (i.e. - speak as though you are talking to an audience or someone in the room, put something above the camera so your attention is on the camera.)
We will use the Big Blue Button inside venueless for Q&A Session
Stage 1 Q&A
Stage 2 Q&A
ddemaio added a sentence about OSC21 on openSUSE Discord to events.o.o. so that people can get in if they lose their email registration.
#action ddemaio send out invite token as participants to attendees and speaker for speakers
#topics for next meeting
Schedule, video uploads, update status of Video Team, technical staff, volunteer, speaker guidelines (tips and tricks), test platform as participants, speaker and other roles | s3://commoncrawl/crawl-data/CC-MAIN-2022-21/segments/1652662519037.11/warc/CC-MAIN-20220517162558-20220517192558-00043.warc.gz | CC-MAIN-2022-21 | 2,429 | 31 |
https://www.blueboathome.com/web/migration/museum-nature | code | This afternoon I biked out to the Museum of Civilization, but I forgot my wallet and so had to come back to Leela's. I walked to the nearby Museum of Nature instead.
The Canadian Museum of Nature is under reconstruction right now, so more than half the exhibits are closed, but I enjoyed what I got to see; it's like a museum of natural history that focuses on Canada. They had a few exotic species in the "creepy crawlers" exhibit, but everything else was native. I particularly enjoyed the exhibit on medicinal plants. | s3://commoncrawl/crawl-data/CC-MAIN-2021-39/segments/1631780056752.16/warc/CC-MAIN-20210919065755-20210919095755-00426.warc.gz | CC-MAIN-2021-39 | 520 | 2 |
http://superuser.com/questions/523327/tv-stream-and-vpn/523334 | code | I'm trying to watch a Spanish TV channel over the Internet from Germany, but I can see this channel neither with a Spanish proxy nor through a VPN. I work with Ubuntu 12.10 and
This live TV channel can be seen in this youtube channel. If you try to play it from outside of Spain you can see a message saying: "this video isn't available".
Main question: how can I "get" a message saying the "technical" reason I can't to play it?
Secondary question: what is the diference between a TV channel (stream) and other type of "internet" service? I mean, why VPN works satisfactorily with other internet services but not with TV? | s3://commoncrawl/crawl-data/CC-MAIN-2016-30/segments/1469257828286.80/warc/CC-MAIN-20160723071028-00164-ip-10-185-27-174.ec2.internal.warc.gz | CC-MAIN-2016-30 | 616 | 4 |
https://acloudguru.com/forums/az-900-microsoft-azure-fundamentals/i-could-not-create-a-storage | code | Storage creation failed , since I received the error message: "Storage Creation failed" Error code: 403. What could be done about that?
I believe you are trying to create a storage account in a location that isn’t allowed. Can you try and create it in the same location as the resource group? And also make sure you are using the credentials provided by the lab.
Hi Lars, thank you for your response. Yes, I had to choose "Central US" as the location for the storage account. This worked then. | s3://commoncrawl/crawl-data/CC-MAIN-2023-23/segments/1685224649518.12/warc/CC-MAIN-20230604061300-20230604091300-00634.warc.gz | CC-MAIN-2023-23 | 495 | 3 |
https://github.com/randallm/coderam-beta | code | Coderam Beta is a Flask webapp that interacts with the GitHub and Wikipedia APIs to provide details on open source software projects on one easy-to-digest page. (140 character pitch: Wolfram Alpha for code) I built this at SF Code Day Feb. 2013, and it won me the Best Application award!
I'm not going to abandon this project just yet. It's a good base to start on in building a Chrome extension, for example... or at least, it will be after some refactoring. (This isn't the best or most elegant piece of code I've ever written, but that's to be expected after ~24hrs of non-stop coding.) | s3://commoncrawl/crawl-data/CC-MAIN-2018-17/segments/1524125945272.41/warc/CC-MAIN-20180421164646-20180421184646-00225.warc.gz | CC-MAIN-2018-17 | 587 | 2 |
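The README above does not include any source, so purely as an illustration of the kind of GitHub API lookup such a Flask app might perform, here is a hypothetical sketch. It is not code from the coderam repository; the route name and the response fields chosen are assumptions, while the GitHub REST endpoint itself is a public, documented one.

```python
# Hypothetical sketch only -- not code from the coderam-beta repository.
import requests
from flask import Flask, jsonify

app = Flask(__name__)

@app.route("/repo/<owner>/<name>")
def repo_summary(owner, name):
    # Public GitHub REST endpoint for repository metadata.
    resp = requests.get(f"https://api.github.com/repos/{owner}/{name}", timeout=10)
    resp.raise_for_status()
    data = resp.json()
    # Pick a few easy-to-digest fields for a one-page summary.
    return jsonify({
        "description": data.get("description"),
        "stars": data.get("stargazers_count"),
        "language": data.get("language"),
        "url": data.get("html_url"),
    })

if __name__ == "__main__":
    app.run(debug=True)
```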
https://www.rachelrosenson.com/fluxsaas | code | Following a successful alpha and beta, Flux.io became a subscription based service in November, 2016. I was a part of the core product team that researched, designed and implemented the pricing strategy and product.
What even is flux?
Flux is a tool for the Architecture, Engineering and Construction (AEC) field to connect design tools and teams. While redesigning the onboarding experience, it came to light that the issue with onboarding was not the act of creating an account, but the fact that users signing up were still unclear about what Flux was.
To answer these questions for myself and our internal team, I drafted up personas of our user base.
What's the plan?
There are many types of subscription plans for software as a service (SaaS). Our product team began with market research and interviews with expert users to garner industry perspective, best practices, and user feedback.
Initially, we leaned towards a per-project plan. Feedback was that the pricing structure was unusual and therefore it was hard for users to gauge whether or not the price was appropriate. We also explored how many tiers of pricing would be appropriate. We decided to start with a standard monthly subscription fee with enterprise options for bulk seats.
How does it work?
Once the business plan was established, we began collaborating with front-end and back-end engineering to see which assets would be new, what existing code we could reuse, and what the opportunity-cost trade-offs of different systems and services were.
What's the story?
How would I use it?
Users expressed that the hardest thing about learning new software is knowing where to start. I brainstormed with our internal team of Architectural Engineers to devise a list of 20 popular yet practical projects and workflows that could benefit from Flux.
Changing from a free to a subscription-based service had a ripple effect throughout the product. One example is that the onboarding experience and template emails would now all need to be updated. I took this opportunity to redesign the emails, going from static copy to a more graphical interface. I created new illustrations that injected our brand into our messaging. All of our existing emails had to be updated, and many new emails needed to be written around billing information and account creation. During all this drafting, I realized that a content style guide would be useful so that anyone at the company could contribute. I conducted tone of voice workshops and created a content guide to ensure our brand consistency.
research + ux + product + prototyping + content copy + illustrations
Show don't tell our product story and launch a paid SaaS | s3://commoncrawl/crawl-data/CC-MAIN-2019-43/segments/1570987833089.90/warc/CC-MAIN-20191023094558-20191023122058-00064.warc.gz | CC-MAIN-2019-43 | 2,651 | 15 |
https://www.telerik.com/forums/contain-actionsheet-inside-container-like-in-demo | code | In the demo on this page: http://demos.kendoui.com/mobile/actionsheet/index.html
the ActionSheet stays inside the demo container. In my app, the actionsheet takes over the whole browser window. How is that done in the demo?
1 Answer, 1 is accepted
answered on 07 Jul 2012, 09:46 AM
The ActionSheet element is added to the Kendo Mobile Application container - that is the element on which the application is initialized - in the demos, this is #mobile-application-container, like this: window.kendoMobileApplication = new kendo.mobile.Application("#mobile-application-container");
All the best,
the Telerik team
Join us on our journey to create the world's most complete HTML 5 UI Framework - download Kendo UI now! | s3://commoncrawl/crawl-data/CC-MAIN-2021-49/segments/1637964363510.40/warc/CC-MAIN-20211208114112-20211208144112-00241.warc.gz | CC-MAIN-2021-49 | 714 | 8 |
http://www.advogato.org/person/robilad/diary/91.html | code | So, according to the last big news, the big question is no longer if, but how. This is how:
The closest collaboration point between free software Java developers outside Sun and the engineers on the other side would be the test suites. Since the free software projects like GNU Classpath and Apache Harmony, just like Sun's developers consider compatibility to be one of their prime goals, a merge of the respective test suite code to a common, free software, compatibility & regression test suite for the core Java specifications would be pretty useful for all implementors, as well as distributions using the DLJ.
Meanwhile, the simplest thing Sun could do to make open source Java reality a little sooner, and create a collaboration effort, would be to modify the current "Read Only" JCK license to allow compilation and execution of Sun's official test suite by independent implementations. | s3://commoncrawl/crawl-data/CC-MAIN-2017-17/segments/1492917123097.48/warc/CC-MAIN-20170423031203-00247-ip-10-145-167-34.ec2.internal.warc.gz | CC-MAIN-2017-17 | 894 | 3 |
https://community.tableau.com/thread/214026 | code | Can you please attach the workbook you were using?
An easy solve might be to just multiply your cents field by 100?
What exactly is going wrong? Is the normal curve not displaying properly or are you not getting anything at all?
If you aren't getting anything, you may need to transform your data slightly before using the INT( ) function:
INT ( [Profit] / 100 ) * 100
This converts each penny into an integer (0.01 x 100 = 1) before applying the INT function and rounding down.
I'm not getting anything at all.
Let me try and put together another workbook to post, I can't post the
current one because it has sensitive information in it.
On Wed, Aug 24, 2016 at 2:50 PM, Mason Forando <[email protected]
Hey guys, I think I got it! I just changed the Parameter range of values to a step size of 0.001, and kept the range from 0 to 1! It worked! | s3://commoncrawl/crawl-data/CC-MAIN-2019-13/segments/1552912202326.46/warc/CC-MAIN-20190320105319-20190320131319-00197.warc.gz | CC-MAIN-2019-13 | 845 | 11 |
https://thinkrf.com/content-open-approach-whitepaper/?utm_source=blog2 | code | A closed, proprietary approach to spectrum analysis no longer works. See how an open approach is changing the game for RF analysis equipment.
How an open, interoperable approach to RF analyzers allows you to build better spectrum analysis solutions.
"The way it’s always been done is no longer good enough, and users can no longer depend on closed solutions with little flexibility. Instead, today’s wireless landscape demands open, interoperable spectrum analysis solutions that make it possible to work with hardware, software, APIs, and programming environments from multiple vendors." | s3://commoncrawl/crawl-data/CC-MAIN-2023-23/segments/1685224649105.40/warc/CC-MAIN-20230603032950-20230603062950-00462.warc.gz | CC-MAIN-2023-23 | 592 | 3 |
http://unifiedpeople.ru/exchhelp.en/html/7e58395a-1230-4c70-91c2-ce464f173ce2.htm | code | Applies to: Exchange Server 2007
Topic Last Modified: 2007-06-11
Use the Schedule page of the New Address List wizard to specify when the address list should be applied. You can also specify the amount of time that the tasks should run.
- Apply the address list
  - Immediately: Click this button to apply the address list as soon as it is created.
  - At the following time: Click this button and use the corresponding drop-down lists to specify a time to apply the new address list.
- Cancel tasks that are still running after (hours): Select this check box and use the corresponding text box to specify how long the new address list task will run. The default is 8 hours. | s3://commoncrawl/crawl-data/CC-MAIN-2021-04/segments/1610703531429.49/warc/CC-MAIN-20210122210653-20210123000653-00471.warc.gz | CC-MAIN-2021-04 | 751 | 12 |
http://game-game.com/23546/ | code | The essence of the game is that you use your attentiveness and intelligence to drill a fairly large number of holes in the ground. For each hole you drill correctly, you'll be paid a certain amount of money. In the lower part of the game you will see two indicators. At the top of the game you can see the total amount of your money. You will have a limited amount of fuel, so you will sometimes have to refuel in this game. | s3://commoncrawl/crawl-data/CC-MAIN-2018-05/segments/1516084891886.70/warc/CC-MAIN-20180123091931-20180123111931-00601.warc.gz | CC-MAIN-2018-05 | 429 | 1 |
https://www.mail-archive.com/[email protected]/msg05143.html | code | Does anyone know of a function to plot a geologic time scale as a series of
concentric circles on a circularly plotted tree?
As far as I can tell there are three available functions that can do this on a
But none of these works with a circular tree, as far as I can tell. It
shouldn’t be too hard to code this manually as a series of concentric circles
(going try that now), but I figured I’d ask here in case someone has already
R-sig-phylo mailing list - [email protected]
Searchable archive at http://[email protected]/ | s3://commoncrawl/crawl-data/CC-MAIN-2018-09/segments/1518891814787.54/warc/CC-MAIN-20180223134825-20180223154825-00536.warc.gz | CC-MAIN-2018-09 | 540 | 8 |
https://www.iptaxsolutions.co.uk/tag/technology/ | code | What is the best technology stack in a modern growth focused business?
Here's a catch-up on some apps and software that we use in our modern tax & accountancy firm and that could (or should) be used in a savvy growth focused business | s3://commoncrawl/crawl-data/CC-MAIN-2021-25/segments/1623488528979.69/warc/CC-MAIN-20210623011557-20210623041557-00198.warc.gz | CC-MAIN-2021-25 | 463 | 6 |
http://www.the-medium-is-not-enough.com/about_this_blog/publishing_schedule.php | code | Every weekday morning before 9am, I put together the daily news. After that, it's a bit random. I try to put together at least one extra item per day, but work being what it is, I can't always do that. But it won't be more than three days between extra items and if I've been a bit slack during the week, I'll try to make it up at the weekend.
About this blog | s3://commoncrawl/crawl-data/CC-MAIN-2017-17/segments/1492917121121.5/warc/CC-MAIN-20170423031201-00502-ip-10-145-167-34.ec2.internal.warc.gz | CC-MAIN-2017-17 | 359 | 2 |
https://www.montana.edu/calendar/events/39417 | code | Lucy Williams' Comprehensive Exam: Flow Decomposition Algorithms for Multiassembly Problems
- Wednesday, October 6, 2021 from 10:00am to 11:00am
Current genetic sequencing technologies allow for fast and cheap measurement of short substrings of genetic sequence called reads, which must be assembled to recover the full unknown sequence. In some cases, such as when assembling RNA transcripts or the genomes of a mixture of species taken in a single sample, the reads come from multiple sequences. In this case, we would like to recover all of the distinct unknown sequences and their relative abundances, and we call this multiassembly. A common model underlying many multiassembly approaches is flow decomposition, which decomposes a flow network into a set of paths and weights that parsimoniously explains the flow. In previous work, we formalized two new variations on flow decomposition to better model the information available in multiassembly tasks. The first, inexact flow decomposition, allows for some uncertainty in the flow measurements. The second, flow decomposition with subpath constraints, incorporates additional information that may be provided by longer reads. We give heuristic and FPT algorithms to solve these problems, and demonstrate their usefulness for RNA assembly on a simulated RNA-Seq dataset.
We propose to extend our existing work in two ways. First, we will apply the flow decomposition with subpath constraints model to real data from RNA transcript assembly and mixed-genome assembly experiments, which will require us to compute flows and subpath constraints from real data and allow us to validate the model on more realistic inputs. This process may also yield new, biologically-inspired variants on the problem. Second, we will use a currently-being-developed integer linear program (ILP) for flow decomposition to further validate the method on larger instances. The ILP appears to be much faster than our existing exact solvers. We will also give extended formulations of the ILP that solve additional problem variants. | s3://commoncrawl/crawl-data/CC-MAIN-2021-43/segments/1634323587608.86/warc/CC-MAIN-20211024235512-20211025025512-00637.warc.gz | CC-MAIN-2021-43 | 2,063 | 4 |
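As an aside on the abstract above: for readers unfamiliar with the underlying model, here is a small, hedged sketch of the basic flow-decomposition idea it builds on, namely peeling weighted source-to-sink paths off a flow on a DAG until nothing remains. It is illustrative only; a simple greedy peel, not the parsimonious, inexact, or subpath-constrained variants (or the FPT/ILP solvers) that the exam actually studies, and all names are invented.

```python
# Greedy flow decomposition on a DAG: repeatedly peel off a source-to-sink
# path and assign it the bottleneck (minimum remaining flow) as its weight.
# Illustrative sketch only; not the parsimonious/ILP methods from the talk.

def decompose(flow, source, sink):
    """flow: dict {(u, v): value} describing a flow on a DAG."""
    residual = dict(flow)
    paths = []
    while True:
        # Follow positive-flow edges from the source toward the sink.
        path, node = [source], source
        while node != sink:
            nxt = next((v for (u, v), f in residual.items() if u == node and f > 0), None)
            if nxt is None:
                break
            path.append(nxt)
            node = nxt
        if node != sink:
            break  # no remaining flow-carrying path; decomposition finished
        # Bottleneck weight for this path, then subtract it from every edge.
        weight = min(residual[(u, v)] for u, v in zip(path, path[1:]))
        for edge in zip(path, path[1:]):
            residual[edge] -= weight
        paths.append((path, weight))
    return paths

# Tiny example: flow of 5 from s to t split over two paths.
example = {("s", "a"): 3, ("a", "t"): 3, ("s", "b"): 2, ("b", "t"): 2}
print(decompose(example, "s", "t"))
```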
https://www.indiedb.com/news/a-nifty-green-light | code | When we started work on our nifty little game, there was this brand new thing Valve was trying out which would enable indie developers to get their game on Steam if the community wanted it. Steam Greenlight they called it an we thought "that's an interesting idea, once our game is mature enough, we should try that". Today, many years later, the day has finally come. We are happy to announce our very first, very own greenlight campaign which you can find on greenlight.aniftygame.com. We would greatly appreciate it if you could vote for us and maybe leave a comment to tell us what you think. Thank you for your support.
a nifty green light
'a nifty game' is now on Steam Greenlight, let us know what you think!
Posted by 1uc4r0 on | s3://commoncrawl/crawl-data/CC-MAIN-2023-40/segments/1695233510994.61/warc/CC-MAIN-20231002100910-20231002130910-00193.warc.gz | CC-MAIN-2023-40 | 735 | 4 |
http://howtocode.net/2015/04/windows-apple/ | code | First forgive me for any errors because I am on mobile. So I am creating an application for a small to medium sized business that basically controls a large part of their business as well as a database to go with it. My problem is not actually with that aspect. I got a phone call today saying that instead of building the application around Windows I need to build it around Apple as they are updating their system to be all Apple machines. I have no problem with this but I was currently halfway done with the system which was windows/oracle base.
The question: is it possible for me to finish the system and somehow port it over to Apple, or will it be faster to tear down what I have and rebuild it in Visual Studio using Xamarin?
by Forumrider4life via /r/csharp | s3://commoncrawl/crawl-data/CC-MAIN-2018-17/segments/1524125937780.9/warc/CC-MAIN-20180420120351-20180420140351-00054.warc.gz | CC-MAIN-2018-17 | 765 | 3 |
https://forums.sqlteam.com/t/generate-number-from-word/1723 | code | Hope someone can help.
I need to generate (at runtime so no functions or stored procedures allowed) a number from a string.
Let me elaborate. I am having to use a table to get the names of suppliers. The person who designed the table, for reasons known only to him, thought it was best to allow the system to generate a primary key for each supplier.
Except this table gets dropped and recreated every night.........so, today Aardvark Pet Supplies might have ID=1 and Acturis Builders might have ID=2. But if a new supplier is created called Abacus Software, when the table is recreated then that would then have ID=2 and Acturis Builders would have ID=3.
So I'm having to use the supplier name for part of a composite key and I'd rather use a numeric identifier.
Is there a way to take Aardvark Pet Supplies and create a unique number from the letters?
I have been messing about with ASCII and SUBSTRING but seem to be stuck.
Any advice greatly appreciated. | s3://commoncrawl/crawl-data/CC-MAIN-2019-09/segments/1550247481428.19/warc/CC-MAIN-20190217010854-20190217032854-00621.warc.gz | CC-MAIN-2019-09 | 958 | 8 |
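As an aside on the question above: this is not the T-SQL answer the poster is after, but a hedged Python sketch of the usual idea may make it concrete: treat each character's code as a digit of a big positional number, which is reversible and therefore collision-free (at the cost of very large numbers for long names). A fixed-width hash would be the compact but collision-prone alternative. The function names are invented for illustration.

```python
# Illustration of the idea only (Python, not the T-SQL the thread asks for):
# map a name to a number by treating character codes as base-256 digits.
# Assumes single-byte (e.g. ASCII) characters so the mapping is reversible.

def name_to_number(name: str) -> int:
    value = 0
    for ch in name:
        value = value * 256 + ord(ch)   # shift previous digits, append this one
    return value

def number_to_name(value: int) -> str:
    chars = []
    while value:
        value, code = divmod(value, 256)
        chars.append(chr(code))
    return "".join(reversed(chars))

n = name_to_number("Aardvark Pet Supplies")
print(n)                   # unique per name, but grows with name length
print(number_to_name(n))   # reversible, so no two names can collide
```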
http://www.peachparts.com/shopforum/349889-post1.html | code | STRANDED!! Emergency 420SEL starter please help
Help, we are in the middle of nowhere (Primm Nevada) and the starter just makes a click. Mercedes roadside assistance says it is the starter. Please tell me how much I should have to pay for the part and how to install it.
Thank you in advance.
Last edited by fredsegal; 02-23-2003 at 11:23 AM. | s3://commoncrawl/crawl-data/CC-MAIN-2016-44/segments/1476988719960.60/warc/CC-MAIN-20161020183839-00503-ip-10-171-6-4.ec2.internal.warc.gz | CC-MAIN-2016-44 | 342 | 4 |
https://ubuntuforums.org/showthread.php?t=1780154&page=2 | code | Good! Could you please change all of them to "Serial" and see if the tablet still works. I'm trying to figure out if the code is case specific, i.e. requires "SERIAL" for version 2 of the 0.10.7 patch. If it is we would need a small change to a line in the code. If not we're good to go.
Changed one of them, works OK.
Alright, that could be it. I did find another report on Ubuntu forums about that error. I thought that was because he was using a serial-to-usb converter and had some other issues. I'm assuming you found the fix here: http://forums.opensuse.org/english/g...cs-tablet.html
I pasted the stuff into a file and applied chmod+x and run it. And the file is OK.
-- I realize now however that this might cause errors, because pasting stuff directly onto command line uses same set of environmental variables.
Hopefully we don't need the fix. But if we do have to generate a new aclocal.m4 because we've added new files and whatnot, no big deal. I can just add:
aclocal && automake && autoconf
to the instructions. | s3://commoncrawl/crawl-data/CC-MAIN-2017-09/segments/1487501174163.72/warc/CC-MAIN-20170219104614-00459-ip-10-171-10-108.ec2.internal.warc.gz | CC-MAIN-2017-09 | 1,023 | 8 |
https://www.oreilly.com/library/view/tuscany-sca-in/9781933988894/kindle_split_023.html | code | Chapter 11. Running and embedding Tuscany
This chapter covers
- Understanding basics of the Tuscany runtime environment
- Running Tuscany in a standalone, web, or distributed environment
- Embedding Tuscany within a managed application container
Tuscany can be hosted in many different environments. It can be run standalone or invoked programmatically as a library. Tuscany is integrated with web containers such as Tomcat, Jetty, and WebSphere to enable web applications with SCA. Tuscany can also be embedded within other application containers such as Java EE or OSGi-based containers. When running Tuscany standalone, you use the Tuscany launcher to run an SCA application from the command line or within an IDE. In the other cases, the host environment ... | s3://commoncrawl/crawl-data/CC-MAIN-2022-40/segments/1664030333541.98/warc/CC-MAIN-20220924213650-20220925003650-00661.warc.gz | CC-MAIN-2022-40 | 762 | 6 |
https://trac.filezilla-project.org/query?status=closed&max=3&page=146&col=id&col=resolution&col=summary&col=owner&col=reporter&desc=1&order=summary&row=description | code | Custom Query (8143 matches)
Results (436 - 438 of 8143)
|#10891||invalid||recovered OS now cannot access ftp client account|
All my info of my account is listed here this is a very old account. Would like to freshen up and start a updated account.
Thank you for verifying. The login information is as follows: ftp2096674 Username: fredsi chged ftp Password: V7L9c#z9
address for explorer transfer ftp://fredsi:[email protected]
Thank you for verifying. The login information is as follows: host: fredsign.com Username: fredsi Password: address for explorer transfer ftp://fredsi:[email protected]
new transfer code ftp://fredsi:[email protected]/ connection address 18.104.22.168:21
Your new IP address is 22.214.171.124. newest IP addresss 2600:8806:4802:900:44c7:755b:e514:54
|#4785||rejected||reconnect to last server fails to forget|
I'm pretty sure this is a bug. When I use the options in the quick connect button 1) to erase the contents of the quick connect bar and 2) to clear the quick connect history, the Reconnect to Last Server button in the toolbar still recalls the server that I just erased.
This has to be a bug. Erasing the quick connect bar and its history should also remove that information from the log. Otherwise, the features are useless.
Even worse, the reconnect button remembers the passwords! Even worse, the reconnect button remembers everything even after a restart.
This is bad! This is a disaster for those of us who are on public or work computers!
|#362||received filesize is a little bit smaller than that on serve|
When downloading a large file, suppose a file on the server has a size of (say) 20822112, but the downloaded one will be (say) 20818104
It needs resume so that the file can be fully downloaded | s3://commoncrawl/crawl-data/CC-MAIN-2023-40/segments/1695233510810.46/warc/CC-MAIN-20231001073649-20231001103649-00683.warc.gz | CC-MAIN-2023-40 | 1,729 | 17 |
https://www.christysverre.com/floral-collection/view/7234051/2/7881203/7973149 | code | I have a strong affinity for wildflowers. There's a certain untamed essence about them, a resilient attitude that says, "I'll thrive here regardless of any obstacles." I find that captivating. Wildflowers possess a unique identity of their own. My perspective on them is expressed through a vibrant and abstract energy. | s3://commoncrawl/crawl-data/CC-MAIN-2024-18/segments/1712296818711.23/warc/CC-MAIN-20240423130552-20240423160552-00466.warc.gz | CC-MAIN-2024-18 | 319 | 1 |
https://www.physicsforums.com/threads/hurdles-of-laymann-misunderstanding-of-mathematics.2726/ | code | Because about 50% of people here in Physics Forum (including even some mentors) don't understand the origin of math and keep parroting that math is a human descriptive construct (see results of the poll), I found it educational to point to a theory which could unearth the deepest layer of reality: http://www.sciam.com/article.cfm?articleID=0007E95C-9597-1DC9-AF71809EC588EEDF&pageNumber=1&catID=2 According to this math (which is the only one to unite GR with QM), not only all matter around us, but space and time themselves are just products of math (of geometry, to be more accurate). This theory deals with metricless objects (Penrose spinors), and quantizing them obtains space and time, gravity and matter. So, Mentat - what do you say now? Just acknowledge that the hurdles you posted about mathematics were a result of your layman misunderstanding of mathematics and of its origin. Layman "logic" is useless in understanding nature. Math is the key. | s3://commoncrawl/crawl-data/CC-MAIN-2019-04/segments/1547584520525.90/warc/CC-MAIN-20190124100934-20190124122934-00431.warc.gz | CC-MAIN-2019-04 | 941 | 1 |
https://nursingdunia.com/fi2jj8/azuydy.php?ca1e6c=list-five-best-practices-in-software-development | code | Other times, they introduce a new feature into the old version of the app. Updating OS components versions frequently is typically the best way to deal with bugs and vulnerabilities. As to the best practices for choosing DevOps tools you can use to approach your DevOps implementation, these can … You will have to go back to the beginning: figuring out the logic all over again, and then rewriting the entire block of code. Best Software Development Tools and Platforms a Developer Should Know: Know which Software Tools developers use for developing the latest and modern feature-rich projects. Automated testing is something almost universally agreed upon as a laudable goal. The idea is to reduce unnecessary complexity. When used in combination they strike at the root causes of software development problems. Such product … Keep them separate. In this post, we take a look at the evolution of infrastructure as code (IaC). Managing Projects - Best Practices Checklist. This will help launch applications quickly and with a small budget. At one point or another, everything touches the development process. We have covered parts of the basic steps of the development life cycle – some great coding practices, testing and documentation methods, and their numerous advantages. We have covered parts of the basic steps of the development life cycle – some great coding practices, testing and documentation methods, and … These best practices for data warehouse development will increase the chance that all business stakeholders will derive greater value from the data warehouse you create, as well as lay the groundwork for a data warehouse that can grow and adapt as your business needs change. Coherence Another key element is a formal change-process to modify requirements on the fly after each sprint begins. What if they want to make some changes because of new or modified requirements? By the same token, too much time may cause developers to procrastinate, and too big of a budget may cause unnecessary spending. Let us know — join our Community Forum, below! The projects that you work on might also end up in someone else’s hands in the future. The seven Lean principles (in this order) are: eliminate waste, amplify learning, decide as late possible, deliver as fast as possible, empower the team, build integrity in, and see the whole. Before coding starts, it is important to ensure that all necessary prerequisites have been completed (or have at least progressed far enough to provide a solid foundation for coding). Learn from enterprise dev and ops teams at the forefront of DevOps. Other times, we may get a little complacent, thinking there’s nothing that could possibly go wrong. Top 5 Customer Service Best Practices ... Research shows that follow-up is the best way to create customer loyalty. This is a guest post from Ava Franklin of GoodCore Software. Whether you’re new to Agile and looking to persuade colleagues about the benefits, or are already using Agile and hoping to improve your team’s workflows, our Agile guide will provide you with tangible lessons to apply in your team.. For more information, download our free white paper, Staying Agile: 5 Best Practices in Software Project Management. 3. 2. Firstly, it will help you clearly define exactly what you are supposed to do. A final word -- training the trainer. 
Whether you work independently or are part of a proper bespoke software development company, these tips will help you keep your work simple and hassle-free so you can enjoy doing what you do. 1) Shift IT from Fire-Fighting Mode to Active Monitoring. In addition to identifying bugs earlier in the development process, it helps developers learn from each other so they can improve their coding skills. We have covered Software Development tools in the following categories. 5 Software Development Process Best Practices. Best Practice #1 – Use an abstraction layer to remove dependencies: one of the common issues with code bases I review is that developers tightly couple their application code with the software libraries they use. Best practices for systems and software development. To help your software team build out a comprehensive development process, here are five additional best-practice recommendations: Establish the "Minimum Viable Product" or MVP that your software projects need to achieve. Simpler answers are usually more correct, and this thought perfectly meets the needs of the development process. Start With Requirements. Software development and IT operations teams are coming together for faster business results. An introduction to IBM Rational Solutions for Systems and Software. Once upon a time, corporate training was almost a timeout day for employees -- often held outside, or within the confines of a training room, it would go on for a set duration, such as half a day or two … To learn more about how Tiempo can help you achieve a higher ROI on your next software initiative please contact us. 1.1 Recruit the Right Resources; 1.2 Select the Right Development Process; 1.3 Make Sound Estimations; 1.4 Define Smaller Milestones; 1.5 Define Requirements; 1.6 Define System Architecture; 1.7 Optimize Design. Top five software development best practices. Every app that you use continuously keeps getting updated every once in a while.
https://cloud.google.com/kubernetes-engine/docs/tutorials/authenticating-to-cloud-platform | code | This example uses Cloud Pub/Sub, although the instructions can be applied to any Cloud Platform service. The example application in this tutorial authenticates to Cloud Pub/Sub using a service account and subscribes to messages published to a Pub/Sub topic from a Python-based application.
This tutorial covers the following steps:
- How to create a service account
- How to assign necessary roles for your service account to work with Cloud Pub/Sub
- How to save the account key as a Kubernetes Secret
- How to use the service account to configure and deploy an application.
The sample application used in this tutorial subscribes to a Pub/Sub topic and prints the messages published to it to the standard output. You must configure the application with the correct permissions, use gcloud to publish messages, and inspect the container's output stream to observe that the messages are received.
Why use Service Accounts?
Each node in a container cluster is a Compute Engine instance. Therefore, applications running on a container cluster by default inherit the scopes of the Compute Engine instances to which they are deployed.
Google Cloud Platform automatically creates a service account named “Compute Engine default service account” and associates it with any compute instances that are created. Depending on how your project is configured, the default service account may or may not have permissions to use other Cloud Platform APIs. Updating this account’s permissions is not the recommended way to authenticate to other Cloud Platform services from the Kubernetes Engine.
The recommended way to authenticate to Google Cloud Platform services from applications running on Kubernetes Engine is to create your own service accounts. Having service accounts that are distinct from the “Compute Engine default service account” provides the following benefits:
Better visibility into, and auditing of, the API requests your application makes.
The ability to revoke keys for particular applications, instead of sharing a service account and having to revoke API access of all applications at once.
Reduced exposure in case of a potential security incident where the credentials of the service account are compromised.
Before you begin
Take the following steps to enable the Google Kubernetes Engine API:
- Visit the Kubernetes Engine page in the Google Cloud Platform Console.
- Create or select a project.
- Wait for the API and related services to be enabled. This can take several minutes.
- Enable billing for your project.
Install the following command-line tools used in this tutorial:
gcloud is used to create and delete Kubernetes Engine clusters. gcloud is included in the Google Cloud SDK.
kubectl is used to manage Kubernetes, the cluster orchestration system used by Kubernetes Engine. You can install kubectl using gcloud:
gcloud components install kubectl
Set defaults for the gcloud command-line tool
To save time typing your project ID and Compute Engine zone options in the gcloud command-line tool, you can set default configuration values by running the following commands:
gcloud config set project [PROJECT_ID]
gcloud config set compute/zone us-central1-b
Enable Pub/Sub API
For this tutorial, you need to enable the Cloud Pub/Sub API on your project as the sample application uses the Pub/Sub API to receive messages from the Pub/Sub topic:
Create a container cluster
Create a container cluster named pubsub-test to deploy the Pub/Sub subscriber application:
gcloud container clusters create pubsub-test
Step 1: Create a Pub/Sub topic
The Pub/Sub subscriber application you will deploy uses a subscription named echo-read on a Pub/Sub topic called echo. Create these resources before deploying the application:
gcloud pubsub topics create echo
gcloud pubsub subscriptions create echo-read --topic=echo
Step 2: Deploy Pub/Sub subscriber application
The next step is to deploy the application container to retrieve the messages published to the Pub/Sub topic. This application is written in Python using Google Cloud Pub/Sub client libraries and you can find the source code on GitHub.
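The application's own logic is small. The actual code lives in the GitHub repository linked above; the following is only a rough sketch of what such a subscriber can look like with the google-cloud-pubsub Python client (the project ID is a placeholder and the print format is illustrative, not the exact sample code):

# sketch of a Pub/Sub subscriber similar to the sample application
from google.cloud import pubsub_v1

project_id = "your-project-id"      # assumption: replace with your GCP project
subscription_name = "echo-read"     # the subscription created in Step 1

subscriber = pubsub_v1.SubscriberClient()
subscription_path = subscriber.subscription_path(project_id, subscription_name)

def callback(message):
    # Print the message in the same style as the sample output, then acknowledge it.
    print("ID={} Data={}".format(message.message_id, message.data.decode("utf-8")))
    message.ack()

streaming_pull_future = subscriber.subscribe(subscription_path, callback=callback)
print("Pulling messages from Pub/Sub subscription...")
try:
    streaming_pull_future.result()   # block and process messages until interrupted
except KeyboardInterrupt:
    streaming_pull_future.cancel()

Because the client library reads its credentials from the environment, the same code runs unchanged once the service account key is mounted into the container later in this tutorial.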
The following manifest file describes a Deployment that runs a single instance of this application’s Docker image:
apiVersion: apps/v1beta1
kind: Deployment
metadata:
  name: pubsub
spec:
  template:
    metadata:
      labels:
        app: pubsub
    spec:
      containers:
      - name: subscriber
        image: gcr.io/google-samples/pubsub-sample:v1
To deploy this manifest, download it to your machine as pubsub.yaml, and run:
kubectl apply -f pubsub.yaml
Once the application is deployed, query the pods by running:
$ kubectl get pods -l app=pubsub NAME READY STATUS RESTARTS AGE pubsub-2009462906-1l6bh 0/1 CrashLoopBackOff 1 30s
You can see that the container is failing to start and went into a CrashLoopBackOff state. Inspect the logs from the Pod by running:
$ kubectl logs -l app=pubsub ... google.gax.errors.RetryError: GaxError(Exception occurred in retry method that was not classified as transient, caused by <_Rendezvous of RPC that terminated with (StatusCode.PERMISSION_DENIED, Request had insufficient authentication scopes.)>)
The stack trace and the error message indicates that the application does not have permissions to query the Cloud Pub/Sub service. This is because the “Compute Engine default service account” is not assigned any roles giving it permission to Cloud Pub/Sub.
To avoid using the node instance default service account or sharing a service account in multiple services, you must create a service account specifically for this application.
Step 3: Create service account credentials
To give the applications running on Kubernetes Engine access to Google Cloud Platform services, you need to use service accounts.
To create a service account, go to Service Accounts in the GCP Console and click Create Service Account:
- Specify a Service Account Name (for example,
- In the Role dropdown, select “Pub/Sub → Subscriber”.
- Check Furnish a new private key and choose key type as JSON.
- Click Create.
Once the service account is created, a JSON key file containing the credentials of the service account is downloaded to your computer. You will use this key file to configure the application to authenticate to the Cloud Pub/Sub API.
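If you prefer the command line, an equivalent service account and key can also be created with gcloud (the account name pubsub-app is just an example; [PROJECT_ID] follows the convention used earlier):

gcloud iam service-accounts create pubsub-app --display-name "Pub/Sub sample app"
gcloud projects add-iam-policy-binding [PROJECT_ID] \
    --member "serviceAccount:pubsub-app@[PROJECT_ID].iam.gserviceaccount.com" \
    --role "roles/pubsub.subscriber"
gcloud iam service-accounts keys create key.json \
    --iam-account pubsub-app@[PROJECT_ID].iam.gserviceaccount.com

Either way, you end up with a JSON key file on disk that the next step imports into the cluster.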
Step 4: Import credentials as a Secret
Kubernetes offers the Secret resource type to store credentials inside the container cluster and use them in the applications deployed on Kubernetes Engine directly.
To save the JSON key file as a Secret named pubsub-key, run the following command with the path to the downloaded service account credentials file:
kubectl create secret generic pubsub-key --from-file=key.json=<PATH-TO-KEY-FILE>.json
This command creates a Secret named pubsub-key that has a key.json file containing the contents of the private key you downloaded from the GCP Console. Once you create the Secret, you should remove the key file from your computer.
Step 5: Configure the application with the Secret
To use the pubsub-key Secret in your application, you need to modify the Deployment specification to:
- Define a volume with the secret.
- Mount the secret volume to the application container.
- Set the GOOGLE_APPLICATION_CREDENTIALS environment variable to point to the key file in the secret volume mount.
The updated manifest file looks like the following:
apiVersion: apps/v1beta1
kind: Deployment
metadata:
  name: pubsub
spec:
  template:
    metadata:
      labels:
        app: pubsub
    spec:
      volumes:
      - name: google-cloud-key
        secret:
          secretName: pubsub-key
      containers:
      - name: subscriber
        image: gcr.io/google-samples/pubsub-sample:v1
        volumeMounts:
        - name: google-cloud-key
          mountPath: /var/secrets/google
        env:
        - name: GOOGLE_APPLICATION_CREDENTIALS
          value: /var/secrets/google/key.json
This manifest file defines the following to make the credentials available to the application:
a volume named google-cloud-key, which uses the Secret named pubsub-key
a volume mount, which makes the secret available under the /var/secrets/google directory inside the container
the GOOGLE_APPLICATION_CREDENTIALS environment variable set to /var/secrets/google/key.json, which will contain the credentials file when the secret is mounted to the container as a volume
Note that the GOOGLE_APPLICATION_CREDENTIALS environment variable is automatically recognized by Google Cloud client libraries, in this case the Cloud Pub/Sub client for Python.
To deploy this manifest, download it to your machine as pubsub-with-secret.yaml, and run:
kubectl apply -f pubsub-with-secret.yaml
Once it is deployed correctly, the Pod status should be listed as Running:
$ kubectl get pods -l app=pubsub NAME READY STATUS RESTARTS AGE pubsub-652482369-2d6h2 1/1 Running 10 29m
Step 6: Test receiving Pub/Sub messages
Now that you have configured the application, publish a message to the Pub/Sub topic echo:
gcloud pubsub topics publish echo --message="Hello, world!"
Within a few seconds, the message should be picked up by the application and printed to the output stream. To inspect the logs from the deployed Pod, run:
$ kubectl logs -l app=pubsub Pulling messages from Pub/Sub subscription... [2017-06-19 12:31:42.501123] ID=130941112144812 Data=Hello, world!
You have successfully configured an application on Kubernetes Engine to authenticate to Pub/Sub API using service account credentials!
To avoid incurring charges to your Google Cloud Platform account for the resources used in this tutorial:
Clean up the Pub/Sub subscription and topic:
gcloud pubsub subscriptions delete echo-read
gcloud pubsub topics delete echo
Delete the container cluster:
gcloud container clusters delete pubsub-test | s3://commoncrawl/crawl-data/CC-MAIN-2018-09/segments/1518891815435.68/warc/CC-MAIN-20180224053236-20180224073236-00016.warc.gz | CC-MAIN-2018-09 | 9,484 | 116 |
https://www.alibabacloud.com/help/doc-detail/60945.htm | code | Container Registry helps you manage images throughout the entire lifecycle by providing secure application image hosting capability, accurate image security scan feature, stable image build service, and convenient image authorization feature. Container Registry simplifies the build and Operation & Maintenance of Registry, supports image hosting in multiple regions, and integrates with cloud products such as Container Service, providing a one-stop solution for using Docker in the cloud.
- You can create or delete an image repository in different regions based on your business needs.
- Each image repository provides the corresponding network address under the public network, intranet, and Virtual Private Cloud (VPC).
- Container Registry supports the convenient image security scan feature, which displays detailed image layer information.
- Container Registry provides image vulnerability reports, which shows multi-dimensional vulnerability information such as vulnerability number, vulnerability level, and fix versions.
- Container Registry supports the source code build of GitHub, Bitbucket, and self-built GitLab.
- Container Registry supports automatic build. The new Docker images are automatically built after the source code is changed.
- Integrated with GitHub, Bitbucket, and self-built GitLab, Container Registry can automatically build new images after the compile and test from source code to applications.
- Integrated with Container Service, after new images are built, Container Registry can easily deploy these images to Container Service clusters. | s3://commoncrawl/crawl-data/CC-MAIN-2019-43/segments/1570986658566.9/warc/CC-MAIN-20191015104838-20191015132338-00060.warc.gz | CC-MAIN-2019-43 | 1,576 | 9 |
https://www.theocc.com/market-data/open-interest/ | code | Daily Open Interest
Daily open interest statistics and totals for Equity, Index and Futures products. Five years of rolling historical data available in HTML, CSV and TXT formats.
Futures Open Interest
Daily futures open interest data by trading symbol including contract expiration and exchange information. Five years of rolling historical data available in HTML, CSV and TXT formats.
OTC Open Interest
Weekly cleared OTC open interest data by symbol including market value and notional value information. The report is generated during the OTC finalization on the last business day of each week. Four weeks of rolling historical data available in TXT format.
Historical Volume Statistics
Historical volume and open interest statistics offering daily statistics by month, monthly statistics and annual volume data from 1973 to present. Daily statistics by month and monthly statistics offer five years of rolling historical data available in HTML, CSV and TXT formats.
Series information query on underlying and option symbol. Includes trading exchange, series/contract date, strike, ticker and open interest. Report viewable in HTML format. | s3://commoncrawl/crawl-data/CC-MAIN-2017-17/segments/1492917119838.12/warc/CC-MAIN-20170423031159-00143-ip-10-145-167-34.ec2.internal.warc.gz | CC-MAIN-2017-17 | 1,143 | 9 |
https://bartslota.com/about-me/ | code | I am a software engineer, consultant, mentor and trainer at Bottega IT Minds. I am also a public speaker, and I blog from time to time. I have gained experience working in numerous industries, including health care, telco, marketing, finance, and energy, for companies of sizes from tens to thousands of people.
I am passionate about OOP, Domain Driven Design, software craftsmanship, microservices and software architecture in general. I am an enthusiast of EventStorming workshops.
In personal life – I am a husband and a father. I love heavy metal music, guitar playing, angling and trekking. | s3://commoncrawl/crawl-data/CC-MAIN-2024-10/segments/1707947476399.55/warc/CC-MAIN-20240303210414-20240304000414-00185.warc.gz | CC-MAIN-2024-10 | 597 | 3 |
https://forum.radxa.com/t/any-gpu-drivers-on-rk3568/10668 | code | I spent a lot of time trying to solve the problem of Qt5.15.5 with openGL cross compile. Using armbian image from here:https://github.com/amazingfate/armbian-rock3a-images/
Maybe it’s a problem with the g++ version. The cross toolchain I used at the beginning was 9.4.0 in ubuntu20.04. There were variety errors when compiling Qt, because the versions in the armbian image were g++ 11.2.0 and ubuntu22. So I changed to aarch64-linux-gnu 11.2.2 and Ubuntu22.04 on my x86 PC, but there are still some errors when
-sysroot is set. I tried to compile it all on rock3a, finally, this time there was no error, but it took nearly 5 hours to compile Qt, and no error was reported when compiling opencv, but at the end of compiling my own app, the following error appeared.
qt.qpa.plugin: Could not find the Qt platform plugin "xcb" in ""
This application failed to start because no Qt platform plugin could be initialized. Reinstalling the application may fix this problem.
Available platform plugins are: eglfs, linuxfb, minimal, minimalegl, offscreen, vnc, webgl.
locate libqxcb.so to find the lib and got
There are some Qt5 libraries in the armbian image. Then I set the
here is part of the output:
Cannot load library /usr/lib/aarch64-linux-gnu/qt5/plugins/platforms/libqxcb.so: (/lib/aarch64-linux-gnu/libQt5XcbQpa.so.5: undefined symbol: _ZN23QPlatformVulkanInstance22presentAboutToBeQueuedEP7QWindow, version Qt_5_PRIVATE_API)
QLibraryPrivate::loadPlugin failed on "/usr/lib/aarch64-linux-gnu/qt5/plugins/platforms/libqxcb.so" : "Cannot load library /usr/lib/aarch64-linux-gnu/qt5/plugins/platforms/libqxcb.so: (/lib/aarch64-linux-gnu/libQt5XcbQpa.so.5: undefined symbol: _ZN23QPlatformVulkanInstance22presentAboutToBeQueuedEP7QWindow, version Qt_5_PRIVATE_API)"
So I am trying to compile this lib:( | s3://commoncrawl/crawl-data/CC-MAIN-2022-33/segments/1659882570827.41/warc/CC-MAIN-20220808122331-20220808152331-00635.warc.gz | CC-MAIN-2022-33 | 1,801 | 12 |
https://itknowledgeexchange.techtarget.com/itanswers/record-locking-2/ | code | First off, The way I read your question tells me you want to Clear a File and after that check to see who is using it. Is that correct?
If so, could you explain why.
I would think you would do it in the reverse order. Make sure no one is using before the clear.
Anyway, here are some options.
Prior to calling the RPG program, you can check for locks and/or ALCOBJ.
Or, once in the RPG program, you can call a CL program or procedure that would check for locks and return a parm. The CL program could create a file or Data Queue that could be used by the calling program to send messages to users.
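For illustration only (library, file and variable names are made up), a CL check along those lines could attempt an exclusive allocation with no wait and return a flag the RPG program can test before doing the clear:

PGM        PARM(&INUSE)
DCL        VAR(&INUSE) TYPE(*CHAR) LEN(1)
CHGVAR     VAR(&INUSE) VALUE('0')
ALCOBJ     OBJ((MYLIB/MYFILE *FILE *EXCL)) WAIT(0)
MONMSG     MSGID(CPF1002) EXEC(CHGVAR VAR(&INUSE) VALUE('1'))
/* Release the lock again if we got it; the caller only needs the flag */
IF         COND(&INUSE *EQ '0') THEN(DLCOBJ OBJ((MYLIB/MYFILE *FILE *EXCL)))
ENDPGM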
http://forextechnicalanalysis42974.free-blogz.com/8830483/new-step-by-step-map-for-crypto-signals | code | The gaming industry will probably get a vital influx of funds through the virtual currency globe. A cryptocurrency precisely suitable for the gaming...
A brutal October has led into an extremely bullish November, not less than to this point, for the British Pound. But with prices nearing a big zone of resistance, can consumers carry on to push? Browse A lot more
Combined cap of cryptocurrencies read a record substantial of roughly US $830 million on January seventh of this calendar year. Above the subsequent thirty day period, we...
But are unable to mathematicians use someone's public crucial to somehow work out what the matching private key is? No. If the public essential is long sufficient, It is really a kind of computationally infeasible responsibilities. And the general public vital may be built assuming that security involves.
Genuine Time Technical Analysis Summary Actual time technical analysis overview for the key currency pairs. This analysis is a comprehensive summary derived from basic and exponential shifting averages in addition to key technical indicators proven for particular time intervals. You'll be able to customize the table by deciding upon your very own preferable forex pairs. Begin Trading
You may have read news accounts of the University of California Berkeley college student who a short while ago decrypted a concept which was encrypted having a 40-little bit key applying 250 workstations as Portion of a contest from RSA Inc.... If that Berkeley pupil was faced with an RSA-provided undertaking of brute forcing a single PGP-dependent (128-bit vital) encrypted message with 250 workstations, it will get him an believed nine trillion situations the age from the universe to decrypt just one information.
Encryption application isn't like standard computer software: if there's a little flaw in standard software package, it may only suggest that in specific instances a spell checker does not catch a miscalculation, or perhaps the keyboard locks up in certain exceptional conditions.
They would like to assist in "making the infrastructure of e-commerce." Properly, they will test. But there are a few problems with masses of individuals counting on digital signatures. Here is how I put it this month to a mailing checklist: ************************************
To realize extra Perception to how we use sentiment to power our buying and selling, sign up for us for our weekly Investing Sentiment webinar.
Mining a currency including Bitcoin or Ethereum will involve connecting desktops to a global community and working with them to resolve advanced mathematical puzzles.
What about 9/11? I am unable to see any rationale to vary anything, or consider nearly anything down. All of this substance is perfectly-regarded, published in guides, and it's all over the place... if rather scattered. If terrorists use the most crucial approach talked about here (PGP), they'd get noticed like anyone pulling over a black balaclava and walking as a result of an airport. And bring down targeted visitors analysis on all their communications.. the sort of chatter index the White Residence talks about. Exactly the same for another crypto methods. Except steganography, that has been much mentioned on the net already -- to be a doable sweet process for terrorists -- but I don't do much more than determine what it is. Meanwhile, there's The entire other aspect: how can corporations (chemical firms, for instance), shield their very own communications from terrorist snooping? Apart from superior encryption, how? I have never see this here heard any remedy. 3 Oct 2003
When I to start with checked out the program (yrs in the past, inside a Scientific American short article), I had been sceptical it was probable to undo the encryption and get the message back again. Nevertheless exactly the same vital, reversed, and put throughout the very same approach is all it will require to decrypt the information. (No trouble, due to the fact the computer does the perform.)
Bitcoin, the primary cryptocurrency at any time developed has indeed become the most generally utilised digital forex on the planet. Ever since the existence of Bitcoin in...
https://progmiscon.org/concepts/operator/ | code | An operator is a symbol or keyword in source code that represents a built-in function.
Related concepts: Function
Closest Wikipedia entry: Operator (computer programming) — In computer programming, operators are constructs defined within programming languages which behave generally like functions, but which differ syntactically or semantically. Common simple examples include arithmetic (e.g. addition with +), comparison (e.g. "greater than" with >), and logical operations (e.g. AND, also written && in some languages). More involved examples include assignment (usually = or :=), field access in a record or object (usually .), and the scope resolution operator (often :: or .). | s3://commoncrawl/crawl-data/CC-MAIN-2023-50/segments/1700679100499.43/warc/CC-MAIN-20231203094028-20231203124028-00059.warc.gz | CC-MAIN-2023-50 | 685 | 3 |
https://forums.unrealengine.com/t/ai-patrol-blueprint/17480 | code | I am doing something similar and am also trying to set up the AI to be able to see and hear the player and then move to and attack. So far I have the AI moving to a random Target Point until it sees the player but it does not attack and I can’t figure out how to get it to hear the player, even though I have all that set up. With all that said, I hope that this will help you out some, I am using the AI character, AI controller (that I set up not the parent AI controller), behavior tree, and BTTasks.
The variable GuardTarget is of type Actor. You don't need the other ones just yet.
Not sure if you have to do this in the controller but I was trying to get other things working with the behavior tree and there is more in my AI Controller that is not pictured. I suggest trying it without this one first and then add this if it doesn’t work after you do all the others.
You mainly want to focus on the Idle branch of this tree for what you are asking for.
This is the decorator for the first selector. This one tells the AI how far it is from the Target Points and the player. The tutorial that I followed on this was for the AI to stand guard and come at the player and chase him only a certain distance from his guard position.
This is the one you really want. This tells the AI to wander from each point. I am using the Target Point as the actor for the AI to go to in the game. This should make him pick a random point to go to. If you want him to follow a predetermined path then take out the Random Integer. Also, the variable WanderPointKey is of type Blackboard Key Selector.
Just so you know, I am not the greatest at programming so there may be much better ways to do this. I have pieced this together from several tutorials and just plugging away at it. That is how I hope to get the rest working also lol. Hope this helps, I know there is not much out there that is very clear to get people started with this new engine but a lot of it is just learning what you can and then testing out everything you can think of, you will break it most of the time but the reward is great when you figure it out. | s3://commoncrawl/crawl-data/CC-MAIN-2021-39/segments/1631780057622.15/warc/CC-MAIN-20210925112158-20210925142158-00652.warc.gz | CC-MAIN-2021-39 | 2,120 | 7 |
http://stackoverflow.com/questions/15522124/how-to-define-compilation-symbols-only-when-profiling | code | Create a new configuration:
- Click Build then select Configuration Manager.
- Click Active solution configuration and select New.
Profile in Name and select which configuration will be used as template (for profiling I guess
- Confirm with OK, it'll create new configurations named
Profile for each project in your solution.
- Open properties of each one of them and in the Build tab add
PROFILE symbol in Conditional compilation symbols then save properties.
Now when you build the
Profile configuration the
PROFILE symbol will be defined. I suggest you take a look to this post too, if you automate your builds you may keep your
PROFILE symbol out of the solution using
MSBuild properties via command-line (I suppose you do not build for profiling very often).
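For example, something along these lines (the solution name is hypothetical; DefineConstants is the MSBuild property that carries the conditional compilation symbols) would give you a profiling build without touching the solution:

msbuild MySolution.sln /p:Configuration=Release /p:DefineConstants="TRACE;PROFILE"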
With a configuration you can do that, but it won't save you from a broken reference to Microsoft.VisualStudio.Profiler.dll. What I suggest is to move all this code to another library that you'll ship to them compiled. There you'll expose just a method:
public static class ProfilingHelpers
{
    public static void StartProfiling()
    {
#if PROFILE
        // Profiler start-up calls (e.g. into Microsoft.VisualStudio.Profiler) go here;
        // this body is compiled only when the PROFILE symbol is defined.
#endif
    }
}
PROFILE is defined (so you won't need to add a new configuration to each project but to one DLL only). | s3://commoncrawl/crawl-data/CC-MAIN-2014-10/segments/1394011044030/warc/CC-MAIN-20140305091724-00059-ip-10-183-142-35.ec2.internal.warc.gz | CC-MAIN-2014-10 | 1,257 | 19 |
https://blogs.perl.org/users/stevan_little/ | code | There is an old saying that "distance makes the heart grow fonder", and watching Matt Trout talking at YAPC::NA this year filled me with the mixed emotions embodied within that saying. I am simultaneously sad that I was not there to correct Matt's swiss cheese memory of past events and happy that I wasn't around to witness his sad excuse for a beard. But alas, this blog post is about much more then Matt's physiology.
Nothing makes me miss all the wonderful folks in the Perl community as not being able to get to YAPC::NA. So while I have not yet watched all of the videos, I did watch the 6 (count them, six!!!) keynotes, and I wanted to just post about my impressions of them.
I recently posted a quick update on the p5-mop project the other day, something I have been meaning to do for a long time. I am sure given the slow and often rocky progress of this project that many people have their doubts if will ever see the light of day, and to be quite honest, some days I found myself doubting it as well.
I really need to expand on this a little more, but for now this can serve as a status update on p5-mop.
so, is -redux now "the last prototype" and p5-mop-XS "the current hopefully-not-just-a-prototype" ?
so here is the official line
was the first (overly ambitious) prototype in which I ignored Perl and went my own way
was my (somewhat child…
As I was doing my daily check of blogs.perl.org, I noticed a recent post by Ovid in which he said ...
Of course, we also need the p5-mop, but that hasn't been touched in a while; I hope it's not dead.
... and that reminded me I really needed to write a blog post to update folks on the status of the p5-mop effort, so here goes.
I have been meaning to write this post for a while, but between $work and getting sick I have not really had the brain capacity to do it. Being that I am still a little sick and $work is still busy, I am making no promises about the quality of this post.
In a previous blog post I mentioned that I was going to port my module Bread::Board to p5-mop which I managed to finish up about three weeks ago and then promised to write a post about my experience. Well better late then never, here is that post.
NOTE: if all the Bread::Board terminology gets tedious, skip down to the TL;DR at the end to read my conclusions. | s3://commoncrawl/crawl-data/CC-MAIN-2023-06/segments/1674764494852.95/warc/CC-MAIN-20230127001911-20230127031911-00781.warc.gz | CC-MAIN-2023-06 | 2,303 | 14 |
https://www.raptorforum.com/threads/need-rubber-boots-for-my-fcr-billet-adapter-from-coal-shed-racing.85237/ | code | i have no idea what brand this thing is, or who made it.....but i got it back when "coal shed racing" was still porting heads. all i need is the rubber boots because mine is cracked on one side. i do not want the billet part because mine is ported to match the head porting. any idea where i can get one....?? | s3://commoncrawl/crawl-data/CC-MAIN-2021-49/segments/1637964363301.3/warc/CC-MAIN-20211206133552-20211206163552-00608.warc.gz | CC-MAIN-2021-49 | 309 | 1 |
https://www.bukkit.org/threads/help-optimizing-jvm-settings.21767/#post-389353 | code | Okay so I am running a server off of a cheap linux box I threw together. When there are only a 2 or 3 players on the server everything is perfect. Once I hit about 6 players it starts to get laggy. When it starts to get laggy, I notice that the RAM usage is around 6 or 7 Gigs. I am running the java JVM with some options but I do not know if any of the options are hurting my server more than helping or if I need other options to increase performance. System Specs: -OS: Ubuntu 11.04 -Ram: 8GB (System reads it as 7.6GB) -Processor: AMD Athlon II X2 250 Processor -Hard Drive: 74GB Western Digital Raptor 10,000 RPM (I had one lying around) Current start options: -server -Xmx7000M (would setting this to lower than almost full system help or restrict how much the server can keep in memory) -Xms7000M -XX:+UseConcMarkSweepGC -XX:+CMSIncrementalPacing -XXarallelGCThreads=2 -XX:+AggressiveOpts Possible Hardware Improvements I might be upgrading my desktop to solid state and moving two 150GB VelociRaptor 10,000 rpm harddrives in raid 0 to the server. Would that help a more? I might also outright buy an SSD for the server if it comes down to a Harddrive bottleneck. I know this is long but for anyone that can help me I would really appreciate it. | s3://commoncrawl/crawl-data/CC-MAIN-2022-27/segments/1656103984681.57/warc/CC-MAIN-20220702040603-20220702070603-00123.warc.gz | CC-MAIN-2022-27 | 1,252 | 1 |
https://gitter.im/FreeCodeCamp/HelpFrontEnd?at=5ade19235d7286b43a5c4d6b | code | Sure, plus you have link to a bank account or user accounts or anything like that, but I just wanted to share why it can be dangerous.
Hey guys, can you hell me with such question :
Where it's better to start (at Freecodecamp or at a beta freecodecamp?)
IMHO it also makes the calculator much easier. once you have it done and are happy with your results it's a good challenge to build your own parser to process the input. You will learn a lot and if a potential employer looks at your code won't ask why are using eval but will be wowed that you built your own parser
@MaximKazionov I think with the beta you get the advantage of so many years of experience and refinement, I also hear it is much shorter and more to the skills you need. But there are no certifications yet, at least last I looked
@NJM8 Thank you so much!
maximkazionov sends brownie points to @njm8 :sparkles: :thumbsup: :sparkles: | s3://commoncrawl/crawl-data/CC-MAIN-2019-35/segments/1566027316555.4/warc/CC-MAIN-20190822000659-20190822022659-00527.warc.gz | CC-MAIN-2019-35 | 901 | 7 |
https://padakuu.com/blockchain-merkle-tree-1137-article | code | Blockchain - Merkle Tree
Since every transaction in a block is hashed in a Merkle Tree, the issue of disk space in a node is easily overcome.
Block headers now contain a hash of the previous block, a nonce, and the Merkle root hash of all transactions within the current block. Because the root hash commits to the hashes of all transactions in the block, old transactions can later be pruned to save disk space. The blockchain now looks like the image below.
As a result, a normal client who just wants to receive payments from others can save a lot of disk space by using this strategy. Miners, however, must keep the entire blockchain in order to verify the payment. Next, we explain how the receiver verifies the payment without being able to trace the coin back to its source. | s3://commoncrawl/crawl-data/CC-MAIN-2024-18/segments/1712296816939.51/warc/CC-MAIN-20240415014252-20240415044252-00763.warc.gz | CC-MAIN-2024-18 | 801 | 4 |
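As a rough illustration of the Merkle hashing described above (Bitcoin-style double SHA-256; the transaction strings are placeholders), a Merkle root can be computed by repeatedly hashing pairs of hashes until a single root remains:

import hashlib

def sha256d(data):
    # Double SHA-256, as used for Bitcoin transaction and block hashes
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

def merkle_root(tx_hashes):
    # Reduce the list of transaction hashes pairwise until one root is left
    level = list(tx_hashes)
    while len(level) > 1:
        if len(level) % 2 == 1:
            level.append(level[-1])        # duplicate the last hash on odd counts
        level = [sha256d(level[i] + level[i + 1])
                 for i in range(0, len(level), 2)]
    return level[0]

transactions = [b"tx-a", b"tx-b", b"tx-c"]
root = merkle_root([sha256d(tx) for tx in transactions])
print(root.hex())

Only this single root value needs to live in the block header; the rest of the tree can be discarded or recomputed on demand.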
http://machinedesign.com/print/controllers/programmable-logic-controllers-0 | code | The Vision130 family now includes flat-fascia Vision130 units, which are suitable for the food and pharmaceutical industries. The units are IP66/IP65/NEMA-4X rated, and withstand spray/wipe-down applications. Models include: V130-J-B1, V130-J-TR20, V130-J-R34, V130-J-TR34, V130-J-TR6, V130-J-RA22, V130-J-TRA22, V130-J-T2, V130-J-T38, and V130-J-TA24.
The units support up to 256 I/O and include digital and analog inputs and outputs, high-speed I/O, PT100/thermocouple capabilities, and load cell for weight measurement. The I/O can mount locally or remotely, up to 1,000 m from the controller. The power-PLCs store recipes, and use Micro SD card memory for data logging, backup, and PLC cloning.
The LCD, 2.4-in., 128 × 64 pixel control panel supports over 1,024 user-designed screens with 400 images/application, a 20-key keypad, and can display trend graphs and a text string library. Internal memory holds 512K of application logic, plus 128K for fonts and 256K for images.
Unitronics Inc., 1 Batterymarch Park, Quincy, MA 02169, (866) 666-6033, www.unitronics.com | s3://commoncrawl/crawl-data/CC-MAIN-2017-09/segments/1487501173866.98/warc/CC-MAIN-20170219104613-00251-ip-10-171-10-108.ec2.internal.warc.gz | CC-MAIN-2017-09 | 1,071 | 4 |
https://www.androidblip.com/android-apps/com.deviceidmanager.html | code | About Device ID Manager
This Device ID Manager will find out your device id (serial number) to share with developers.
This is a very useful tool for Gear VR developers and end users.
Device ID Manager displays your device id, which is required for signature file creation (OSIG).
Create a signature file or learn more about how to use it below:
The App copies your Device ID to your clipboard and activates the share function e.g. email etc.
Note: You DON'T need to root your device. | s3://commoncrawl/crawl-data/CC-MAIN-2023-40/segments/1695233506399.24/warc/CC-MAIN-20230922102329-20230922132329-00398.warc.gz | CC-MAIN-2023-40 | 483 | 7 |
https://ez.analog.com/amplifiers/instrumentation-amplifiers/f/q-a/114473/ad8224-low-frequency-noise-issue?pifragment-7659=2 | code | In one of our designs we are doing signal conditioning of a load cell (Wheatstone bridge)
We have an issue where the main instrumentation amplifier an AD8224 has an output signal that wanders by around 20mV over the course of 10's of seconds.
We have gone through troubleshooting the signal chain and tracked the issue back to here, the above plot is taken with the inputs to the amplifier shorted out and connected to 0V.
Supply to the op-amp is +-15V, and has 0.1uF bypass directly near the IC, and 10uF about 3cm away. We are using a gain resistor of 49.9ohm which should give a gain of around 991.
We are not worried about the high frequency noise; this gets tidied up later in the chain.
I was able to find one of our old board revisions that does not show the same issue, these boards are modular, and I was able to slot in the new and old next to each other to compare the results. The old board used a TI INA2126 instead of the AD8224.
Channel 0 red is the old revision and shows what I would expect of changes due to thermal effects.
Channel 1 yellow shows the same board as per above.
In terms of ruling out other factors, this is in a room without much airflow; breathing on the board doesn't seem to have a large effect, nor does insulating it from airflow by wrapping it in rags. My reference pins are tied directly to zero volts. I have confirmed I get a nice clean (1-2 bits of noise) signal recorded if I short the output of the amplifier to 0V (as expected).
I am all out of ideas, is this expected performance of the AD8224 (from the datasheet I think I should be getting at least an order of magnitude better).
I am certainly not an expert, and haven't had to do much careful temperature control on a project before, besides dealing with power dissipation and temperature co-efficients of shunt resistors.
I am just trying to figure out what I can do to try and replicate the plot found in the datasheet, "Figure 10. 0.1 Hz to 10 Hz RTI Voltage Noise (G = 1000)". I would have expected to be able to get something in the same ballpark, though since I am measuring at the output the scale is in mV because of the 1000 gain.
I have been able to replicate the results across at least 5 boards, and at least one of the boards was assembled with ICs from a different shipment.
Did the layout look reasonable?
I've asked around for some help with your issue.
Your layout looks ok to me. I'm assuming the traces that run off the picture to the right (AD8224 pins 6, and 7 for REF1 and REF2) go down to via ground close-by. With C1-C6 replaced with shorts to ground as you have it, I'd imagine it'd not be necessary to cut the traces to +IN and -IN at pins 1, 4, 9, and 12 of AD8224. However, if I were you I'd try cutting those traces just to make sure nothing is coupling in from the these connections (although it'd be hard to believe that anything would if you've already grounded these - but just in case).
I'll keep you posted if something comes up whilst discussing with colleagues.
For the level of precision you are trying to achieve, I wouldn't even consider a two layer board.
Four minimum. I would also not use a dual. Interaction, cross talk, layout problems, etc.
For figure 10, they probably had a low pass filter with a corner of 10-20 Hz.
I'm assuming your scale signal is not that fast, so I would consider slower, autozero InAmps.
LT2053, AD8421, AD8237, LT1167
or, to solve the whole problem and reduce system cost, look at the ADA4558.
Any solder joint has a tempco of 5-35uV/C, so balanced layout is extremely important.
My colleague has used an EVAL board to see if we can duplicate your issue. We configured the first (Dual) site so that one of the in-amps has a gain of 991, using +/-15V supplies, starred the inputs to the Ref gnd, and then measured at the output of the channel with the high gain. The attached scope photo shows the noise over ~400s. At this scale (10mV/div) no "slow" wander is visible to me, unlike the scope photos you'd attached.
We did notice that the measurement was very susceptible to external noise; moving a cell phone near the board on the lab bench would result in large noise spikes seen on the scope. So, perhaps some shielding would help you.
Please take a look at Harry's comments above as well. | s3://commoncrawl/crawl-data/CC-MAIN-2021-04/segments/1610703557462.87/warc/CC-MAIN-20210124204052-20210124234052-00775.warc.gz | CC-MAIN-2021-04 | 4,262 | 27 |
https://www.clickonf5.org/tutorial/howto-dual-boot-option-pc-windows/8723 | code | One can have dual boot option on PC to boot multiple operating system on single computer. For example, you can have Windows XP, Linux, Vista, Windows 7 etc on single PC by installing them in different partitions. But to have that dual configuration, you will have to install the latest version after the earlier ones. For example, you can have dual boot option if you have installed Windows 7 in different partition on a PC with already installed XP. But if you will install XP after Windows 7, boot screen will not see any option to start Windows 7. But there may be a situation where you need to install Windows XP on your pre-installed Windows 7 PC.
A few days ago, I purchased a Tata Photon Plus mobile Internet data card which was supposed to run smoothly on Windows 7, but that didn't happen. The device started facing some issues with the Windows 7 64-bit driver. Then I thought to have Windows XP in another partition so that I can still run Tata Photon Plus over there. But as I said, if you install the earlier version of Windows after the later one, you will not get the dual boot option, as the boot configuration file will be overwritten by the newly installed Windows file, which is not the updated one. But I installed Windows XP on a pre-installed Windows 7 PC and made some changes to get the dual boot as well. Here is the way to get that.
So I installed Windows XP in different partition of my computer hard disk and then lost Windows 7 boot option. After the successful installation of Windows XP, computer started properly but with only XP. To get the dual boot option and get back Windows 7 boot option, you will have to follow below steps,
Steps to get dual boot on PC even Windows XP was installed after Windows 7
1. Start your computer (It will start with Windows XP)
2. Download Microsoft .net framework and install the same
3. Download and Install EasyBCD on your current Windows
4. Now use Add/Edit Entry to add Windows XP in the list. To do that, Select Windows NT/2K/XP in the type dropdown and type Windows XP in the name section
5. You may set the drive where XP is installed
6 Finally, you will have to “Write MBR” which is available under “Bootloader Setup” tab in the left navigation. select “Install the windows vista/7 bootloader to the MBR”
Done. Now restart your computer and there you can see dual boot option for Windows 7 and Microsoft Windows XP. You can use EasyBCD for multiple boot options as well. For example, to have Ubuntu, Windows XP, Windows Vista and Windows 7 on a single PC. | s3://commoncrawl/crawl-data/CC-MAIN-2024-10/segments/1707947474641.34/warc/CC-MAIN-20240225171204-20240225201204-00477.warc.gz | CC-MAIN-2024-10 | 2,528 | 11 |
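For reference, the boot entry that EasyBCD writes can also be created by hand from an elevated Command Prompt in Windows 7 using bcdedit (the drive letter is an assumption - use the partition that holds the XP ntldr and boot.ini files):

bcdedit /create {ntldr} /d "Microsoft Windows XP"
bcdedit /set {ntldr} device partition=D:
bcdedit /set {ntldr} path \ntldr
bcdedit /displayorder {ntldr} /addlast

EasyBCD simply wraps these steps in a friendlier interface, so either route produces the same dual boot menu.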
https://www.intelletec.com/jobs/machine-learning-lead | code | A fast series growing Series A startup is looking for a Machine Learning Lead who can help the team to dig into a variety of textual data. The company is backed by top-tier VCs and angel investors.
You will be a founding member of the Machine Learning team, and this role will have an outsized impact on the future of the company's platform. You will develop Machine Learning models for the platform and run them at scale to automatically tag incoming partner data. This essentially involves turning terabytes of incoming and proprietary local data into insights, to understand how cities really work. It's a chance to have your fingerprints all over the product, and to work with and help build a world-class team working on one of the largest and most challenging problems - local.
- Leveraging a large amount of textual data in various forms, build and train natural language understanding systems.
- Develop new algorithms and modeling techniques, conduct experiments to prove these new techniques and integrate them into the live production system for tagging, text classification, sentiment analysis, named entity recognition etc.
- Keep up with recent advances in natural language processing (NLP), machine-learning (ML) and big data processing.
- Work closely with other team members on the development and support of new products.
- Run online experiments and develop metrics that can drive product requirements.
- MS in CS and Natural Language Processing (NLP), Machine learning (ML). PhD is big plus.
- Strong expertise in Natural Language Processing (parsing, entity recognition and detection, text classification etc.) and language modeling at scale using neural networks and classical techniques.
- Experience with commonly available tools and infrastructures for natural language processing, text mining, machine learning and parallel data processing.
- Experience in large scale software development and product development process.
- Knowledge of at least one modern programming language (C++, Java, Scala) and scripting language (Python, Perl).
- Experience working with large amounts of user-generated content and data processing in large-scale environments using Amazon EC2, Storm, Hadoop and Spark.
- 3+ years of professional experience in the industry.
- Curiosity about the local content and local news space.
- Being an active participant in meet-ups/groups related to: Data Analytics, Hadoop, Cloud Computing, Data Visualization, Data Mining, MapReduce, Machine Learning, High Scalability Computing, Predictive Analytics | s3://commoncrawl/crawl-data/CC-MAIN-2022-05/segments/1642320304876.16/warc/CC-MAIN-20220125220353-20220126010353-00158.warc.gz | CC-MAIN-2022-05 | 2,527 | 16 |
https://gamecrawl.com/play/sGQ5j10KVqc | code | Yu-Gi-Oh! YCS Pasadena Nov2022 - ROUND 1 : Labyrinth Musketeer vs Zombie
2022-11-05 submitted by
Screen recorded and edited by LaunchOfDueling. Thanks for watching this video, please subscribe for more things Yu-Gi-Oh! LaunchOfDueling Social Networks: INSTAGRAM:
Watch full event and original streaming at Official Yu-Gi-Oh! TRADING CARD GAME:
遊戯王, Yu-Gi-Oh!, Yu-Gi-Oh, yugioh, yugioh!
https://www.blackhatworld.com/seo/wanted-designer-3d-art-and-graphics-person.166513/ | code | I am looking for a 3D art, graphics and designing partner for some of my long term WhiteHat projects. I already have people working with me but I have more work than my team can handle. Now, I do not have any time to waste with people who aren't close to outstanding (let alone good). If anyone is interested and thinks he/she is good enough, please PM me and I'll send you my email id to send your portfolios. I would prefer someone who has the talents and possibly experience in a lot of varied designing fields. If you're serious about hitting me up for this, ask yourself if you have the capacity to design amazing web templates, email templates, brochure templates, logos, business cards, any custom graphics (2D or 3D), maybe even graphics for some online game (yes, I'm really looking for an advanced designer), GUI for softwares, whatever else that you can possibly encompass under 'Design, Art and Graphics'. I need to know that you're serious about this venture and I don't know how to ensure this. I've had people come to me, waste an hour and a half of my time listening to me explain some good details about a big project and then disappear. I dont want that to happen again. I dont know, maybe even designing a few web templates for a site for a start or I don't know. But before we even decide to work together, I need to see some good samples and references of your work. PM me and lets get going! Thanks for reading through this essay of a post (written boringly with no change of fancy fonts or font size). Hope to hear from some nice people and hope to do some good work with ya! | s3://commoncrawl/crawl-data/CC-MAIN-2018-47/segments/1542039744561.78/warc/CC-MAIN-20181118180446-20181118202446-00441.warc.gz | CC-MAIN-2018-47 | 1,598 | 1 |
https://ux.stackexchange.com/questions/70412/ios-lock-screen-has-letters | code | If you're in the US, why don't you call Apple and ask them.
Apple have a dedicated iPhone helpline.
The number is 1-800-694-7466
Or if you find it easier to remember, it's 1-800-MY-IPHONE
EDIT It seems I'm just too funny for words, so here's a more direct and unfunny answer:
Some people may use a letter mapping of the numbers. They might do this because it's easier to remember - people of all ages and abilities use smartphones! Or they might do it because that's how they did it on a previous phone, or because they've use the same system on another device or product - Android and others also have the same number/letter mapping. They might use the same passcode/word on their TV set top box - it doesn't matter why - the option is just there.
People may like to keep things simple to use so would rather opt to use a 4 digit pin rather than a password with numbers and letters.
A proper password needs to have a keyboard entry which is significantly more fiddly than the 10 button input above. People who want the security but who have difficulty using the keyboard really don't want to have to be forced to use the keyboard every time they wake the phone or when they need to enter the passcode during other protected tasks.
Note that you can opt to use a longer passcode but have the same 10 key entry system so long as you only use numbers in the code. For longer numbers it's even more useful to have the letter mapping to make it easier to recall.
So - the options are there for different people to make choices and use the system in a way that works best for them - whatever that is, and without compromising the usability for those who just enter 4 digits without caring about the letters.
It's no good saying 'Well I don't use it' or 'That wouldn't work for me'. The point is that the design is as inclusive as possible in order to cater for everyone to use their preferred method.
Look at it another way: By removing the letters, Apple would be effectively directly targeting those people who prefer to map their simple passcode onto letters, and making their life more difficult. And what would be the point of that? | s3://commoncrawl/crawl-data/CC-MAIN-2023-50/segments/1700679100686.78/warc/CC-MAIN-20231207185656-20231207215656-00082.warc.gz | CC-MAIN-2023-50 | 2,132 | 12 |
http://www.tiagovieira.pt/blog/2016/11/15/sentiment-analysis-pt-pt-final-thoughts-tips/ | code | In this series of posts we set out to create a machine learning model that would correctly distinguish positive tweets from negative tweets in Portuguese (from Portugal). While we achieved good results (70+% accuracy), there are some things that I would like to cover.
The first and probably most important thing is the fact that we did not use “neutral tweets”. While some (if not most) papers simply ignore tweets with neutral content I find that a sentiment analysis model that can only separate positive from negative tweets is lacking. In fact, in the startup I mentioned we had to consider neutral tweets so, here are a couple of things we tried:
- Creating a model like we did but adding a threshold interval, where tweets we are not 100% sure are either negative or positive are treated as neutral (see the sketch after this list). This turned out not to work because our models separated both classes quite clearly.
- Considering every tweet without emoticons a neutral tweet. We had labeled tweets and noticed that about 70% had neutral content so the idea is that this would not impact our performance that much. It did.
- Adding neutral text from other sources. We experimented with adding sentences from different review sites like zomato, yelp, etc to create a neutral dataset. This actually worked quite well.
- Analyzing the twitter content and inferring neutral tweets based on POS tags. This was done here and it seems like a promising idea, but it would be a different project altogether.
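Here is a minimal sketch of the threshold idea from the first bullet above, assuming a scikit-learn style classifier that exposes predict_proba; the cut-off values are arbitrary, not the ones we tried:

    import numpy as np

    def classify_with_neutral(clf, X, low=0.4, high=0.6):
        """Label a tweet neutral when the positive-class probability falls inside the uncertainty band."""
        proba = clf.predict_proba(X)[:, 1]  # P(positive) for each tweet
        return np.where(proba >= high, "positive",
                        np.where(proba <= low, "negative", "neutral"))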
The second point I want to talk about is that the training dataset we used was quite small. Twitter has a rate limit for how much information you can get from it and the Portuguese community on twitter is not that big (at least for residents of Portugal). There is also the fact that, in this notebook, we only played around with default values (both for word2vec, PCA and the models). We would probably be able to improve performance if we played around a bit more. Still, the point of these posts was to give an overview of how this kind of thing is/was done.
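As a rough illustration of the knobs mentioned above, this is the kind of word2vec + PCA + classifier pipeline that could be tuned beyond the defaults (a sketch with arbitrary parameter values, not the code from the notebook; tokenized_tweets and labels are assumed to exist):

    import numpy as np
    from gensim.models import Word2Vec
    from sklearn.decomposition import PCA
    from sklearn.linear_model import LogisticRegression

    # tokenized_tweets: list of token lists; labels: 0/1 sentiment per tweet
    w2v = Word2Vec(tokenized_tweets, vector_size=200, window=5, min_count=2, workers=4)  # older gensim uses size=

    def tweet_vector(tokens):
        vecs = [w2v.wv[t] for t in tokens if t in w2v.wv]
        return np.mean(vecs, axis=0) if vecs else np.zeros(w2v.vector_size)

    X = np.vstack([tweet_vector(t) for t in tokenized_tweets])
    X_reduced = PCA(n_components=50).fit_transform(X)
    model = LogisticRegression(C=0.5, max_iter=1000).fit(X_reduced, labels)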
Lastly, some thoughts about working on this project. While we started with this idea of creating a perfect sentiment analysis engine, it became clear that rather than worrying about identifying positive and neutral tweets, what we should have actually been trying to do was identify negative tweets. In the context of the whole project, negative sentences bring much more information about your business than positive or neutral ones (you can learn much more from criticism than from praise).
There are also some cool things we started doing with word2vec and relationships between promotions and companies but I’ll leave that to another post.
Hope you have enjoyed reading about this small adventure, let me know if you want to hear more about it! | s3://commoncrawl/crawl-data/CC-MAIN-2023-40/segments/1695233506423.70/warc/CC-MAIN-20230922202444-20230922232444-00812.warc.gz | CC-MAIN-2023-40 | 2,783 | 10 |
http://vcad.ir/faq-how-do-i-find-the-source-url-for-an-i-drop-block/ | code | FAQ: How do I find the source URL for an i-drop block?
Do one of the following:
- Mouse over a block that was added using i-drop. The URL is displayed in a tooltip.
- Navigate to and open <file name>_idrop.txt. All blocks added to the drawing using i-drop, and their sources, are listed.
https://www.thedebugstore.com/gps-active-embedded-antenna-mikroe-3375.html | code | GPS Active Embedded Antenna
This GPS Active Embedded Antenna is an excellent choice for all GPS/GNSS Click boards™ in our range. With its high gain and active band filtering, it is a perfect choice when reinforced positioning is required. It can be mounted directly on the PCB. The antenna features a small cable, 10 cm in length, terminated with a small IPEX connector, allowing it to be positioned away from the board.
- GPS Active Embedded Antenna (1)
http://e-priroda.rs.ba/en/endemics/details/14368/ | code | Original taxon name: Procrustes excavatus.
Author of the original taxon name: Charpentier, T.de,
Classic locality: montibus Pyrenaeis [error]
Reference where the scientific name of taxon was first described in: Charpentier, T.de, (1825). Horae entomologicae, adjectis tabulis novem coloratis. A. Gosohorsky. Wratislaviae: 55 + pp., 9 pls pp.
Economy where the taxon was described in: ?
Specific description of the place: Pyrenees[?]
Global distribution of taxon: ALB,BIH,HRV,MKD
Reference where the scientific name of taxon was accepted in: Löbl, I. & Löbl, D. (eds.) (2017). Catalogue of Palaearctic Coleoptera Volume 1 Revised and Updated Edition. Archostemata - Myxophaga - Adephaga. Brill. Leiden-Boston: 1443 pp.
https://www.lordoflifemillwoods.com/covid-policy | code | Our Safety Protocols for Worship
We meet "in-person", and when we do, we follow these measures:
Our "in-person" worship features:
6-foot spacing of pews
no physical contact
We commit to following guidelines set for us by Alberta Health. | s3://commoncrawl/crawl-data/CC-MAIN-2023-40/segments/1695233511000.99/warc/CC-MAIN-20231002132844-20231002162844-00051.warc.gz | CC-MAIN-2023-40 | 236 | 6 |
https://tf2maps.net/downloads/noodleswords.8636/ | code | My first attempt to create a fun and working TF2 map.
Hey all! I recently started learning how to use Hammer about a week ago, and decided to have a go at creating a map. I know it's fairly small, but this is my first project. Please give me any feedback, I want to make something great one day!
- Steam Workshop Link | s3://commoncrawl/crawl-data/CC-MAIN-2023-14/segments/1679296948609.41/warc/CC-MAIN-20230327060940-20230327090940-00770.warc.gz | CC-MAIN-2023-14 | 316 | 3 |
https://github.com/easymock/easymock/issues/78 | code | Migrated from: CodeHaus issue EASYMOCK-67
Original reporter: Henri Tremblay
Easymock and easymock-ce MANIFEST.MF files currently contain no OSGi headers. Both jars should be made OSGi-ready.
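For context, making a jar "OSGi-ready" essentially means adding bundle metadata to META-INF/MANIFEST.MF, typically generated with a tool such as bnd or the maven-bundle-plugin. The header values below are only illustrative, not EasyMock's actual metadata:

    Bundle-ManifestVersion: 2
    Bundle-SymbolicName: org.easymock
    Bundle-Version: 3.0.0
    Export-Package: org.easymock;version="3.0.0"
    Import-Package: org.objenesis;resolution:=optional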
https://www.brc.ac.uk/biblio/recording-behaviour-field-based-citizen-scientists-and-its-impact-biodiversity-trend | code | The recording behaviour of field-based citizen scientists and its impact on biodiversity trend analysis
Summary Opportunistic species sightings submitted by citizen science volunteers are a valuable source of species data for trends analysis, as used in biodiversity indicators. However, projects collecting these data give people flexibility where and when to make records, and the recording behaviour of participants varies between individuals. Here we tested the effect of recorder behaviour on outputs of the analysis of temporal biodiversity trends. Using a large (c. 3 million records), 20 year unstructured citizen science dataset of butterfly records in Great Britain, we manipulated recorder behaviour by constructing biased 50% subsamples of the dataset by preferentially including different types of recorders (based on high and low values of four metrics independently describing the temporal, spatial and taxonomic attributes of recorder behaviour). We found that, in general, the three outputs (namely: occupancy trend, precision of the trend, and the estimate of occupancy) showed relatively little deviation from random expectation across most of the different types of recorder behaviour. Occupancy trends showed least deviation, while estimates of occupancy itself showed greatest deviation from the random expectation. Regarding the recorder behaviours, the outputs were most sensitive to variation in ‘recorder potential’, which describes the difference between ‘thorough’ and ‘incidental’ recorders. Importantly, by demonstrating the robustness of occupancy trends to differences in recorder behaviour, this study provides support for the appropriate use of occupancy trend modelling for unstructured citizen science. However, we did not consider change in recorder behaviour over time, so further research is required to assess the impact of this on trend modelling. This study highlights the value of developing solutions to further increase the robustness of biodiversity trend analysis. These solutions should include both analytical developments and enhancements in project design to engage participants.
https://www.br.freelancer.com/u/saneera223 | code | I am a dedicated and hardworking full-stack developer committed to client satisfaction. Over the last 12 years, I have worked on challenging and interesting projects that have sharpened my knowledge of web technologies.
**Areas of expertise**
Microsoft .NET stack: ASP.NET/MVC using C#, SQL Server (experience - over 10 years)
Cloud stack: Web API, Azure AD, AWS (experience – about 2 years)
Web markup: HTML/HTML5, CSS/CSS3, LESS, PSD to HTML (mobile responsive) (experience – about 5 years)
Payment Gateways: PayPal, Stripe Integration (experience – about 2 years)
Why should I be hired?
**Client satisfaction is my utmost priority**
- quality work within tight deadlines.
- Working flexibility on client demand.
- 24 * 7 available to communicate over chat, voice calls
- Technical support even after job completion. | s3://commoncrawl/crawl-data/CC-MAIN-2023-23/segments/1685224646457.49/warc/CC-MAIN-20230531090221-20230531120221-00128.warc.gz | CC-MAIN-2023-23 | 845 | 12 |
http://luminoscerebrae.blogspot.com/2009/05/new-desktop-pc-memorial-day-2009.html | code | Update 6/3/2009: I was able to install Windows 7 RC (x64). Turns out that the Windows 7 installer has difficulty with the IDE/SATA adapter I was using to connect the DVD drive to a SATA channel. Removed the adapter, connected the DVD drive directly to the IDE channel on the mobo, and peace was restored to the land.
What better way to honor our military personnel than to go shopping, and shopping I did, with geek style.
Motherboard: Gigabyte EX58-UD3R (rev 1.0)
Video: Diamond Radeon HD 4870 (OEM version from Fry's)
HD (OS): Patriot SSD SATA2 32GB (my first solid state drive)
HD (Data): Western Digital Caviar Green 500GB
OS: Windows XP 32-bit (later replaced by Windows 7 64-bit RC; see the update above)
This was my first Gigabyte mobo. My previous mobo was an Abit IC7-G and I had very good experience with it, but since Abit is no longer a player I needed to pick a new vendor. The EX58-UD3R had good user reviews and seems to be well-made, but the BIOS update utility failed to retrieve an update from any of the 5 server choices. It did, however, successfully update once I downloaded the BIOS update file manually. There are other nifty utilities for overclocking and monitoring, all of which seem to be working correctly. There's even a utility that allows you to upload a custom image for the POST boot screen!
I ordered the mobo, CPU, and RAM from MWave as a bundle, so these parts were assembled/tested before I received them. No thermal compound fun this time. :-(
The video card, SSD drive, and case were purchased locally at Fry's Electronics. I had a good sales person helping me who managed to find the smaller 32GB drive and an OEM version of the video card. I wanted to foray into the new SSD technology, but the drives are still relatively expensive. Given that I wanted a fast drive for the OS and that it didn't need to be very large, I opted for the smallest SSD drive I could get. Still, it set me back $120 for 32GB.
The Antec case is a good model from a quality manufacturer. I refuse to buy cheap crappy parts, and this includes the case. The one I picked at the store, however, was a return that had not been marked, and was missing parts. I went ahead and finished the build, knowing that I'd be disassembling the whole thing the next day to exchange the case. The case came with 3 120mm fans, which I need since I run a GIMPS client. I had fun assembling the machine and totally geeked out on the cabling.
I have a copy of Windows 7 RC, which I've installed on my other 3 year old machine and it worked like a champ (IMHO, Windows 7 is like Vista, only not crappy). I had plans to do the same on the new rig, but Win7 would not install. The previous installation on my old hardware took ~15 mins start-to-finish. However, after over an hour initializing the installation, I figured it wasn't going to work for some reason, and quit. I tried a few more times, disabling as much hardware as I could, but to no avail. Disappointing. I do hope that whatever the issue, it gets resolved for the final release because I don't want to be limited to XP for the next couple of years (and Vista is out of the question).
The machine is together now and I'm getting all my software installed (again!). GIMPS is chugging away, increasing both the knowledge of mankind and the temperature in this room. World of Warcraft runs like a champ now, too (Ultra mode, baby!).
I, myself, am a veteran of the U.S. Army, having served with the 568th Eng. Co. from 1992-1996. Wild and Ready, Sir! Hoooah! | s3://commoncrawl/crawl-data/CC-MAIN-2017-17/segments/1492917121865.67/warc/CC-MAIN-20170423031201-00013-ip-10-145-167-34.ec2.internal.warc.gz | CC-MAIN-2017-17 | 3,474 | 14 |
http://zooart24.ru/payment-delivery/ | code | Payment and Delivery
The following Documentation will help you better understand theme settings and configuration options. Please note: this documentation is related to the theme itself. If you need more information on working with Shopify please refer to Shopify official documentation.
- Responsive design - theme works fine on all resolutions starting from 320px.
- Bootstrap framework - theme layout and styles are based on popular CSS framework Bootstrap.
- Theme settings - the theme provides various options that allow you to tune it to your needs.
- Extended color options - multiple color options selectors included so you can change theme colors the way you want.
- Dropdown menu - allows to display extra content in main navigation.
- MailChimp newsletter - theme has an integrated MailChimp newsletter system
- Customers account - theme allows customer registration and accounts.
- Product image zoom - view full product image in pop-up.
- Contact form - theme contains fully functional contacts page with Google map and contact form.
- Currency switcher - display product prices in the desired currency with a single click.
- Customizable slider - with theme settings you can edit home page slider content.
- Payment methods widget - allows to display logos of the payment methods available for your store. | s3://commoncrawl/crawl-data/CC-MAIN-2019-09/segments/1550247489729.11/warc/CC-MAIN-20190219081639-20190219103639-00218.warc.gz | CC-MAIN-2019-09 | 1,465 | 17 |
http://kiannaskorner.blogspot.com/2012/10/organized-start.html | code | I'll admit it: Getting and staying organized is hard sometimes. Pretty calendars and neat lists sound exciting and those alone keep me going for a few days but then what?
I've made a three-columned list that helps me take organizing one day at a time.
I laminated my list so that each day, I can write down my tasks (in dry erase marker) and erase them the next day and start all over again.
How do I categorize and prioritize between the three columns?
The must do column is pretty self-explanatory. Here I write down the tasks that I must complete for the day such as homeschooling, Bible reading and such.
The should do column is comprised of the things that, if I have extra time, I should do (to get a head start on the next day's must do list.) It is not mandatory to do the things on this list.
The can do list is a free time list. If I complete my must do tasks, I can indulge in using my iPod or some other extra privilege.
So far, this method has helped me manage my time better. Before, the day would go by and I didn't have time to fully complete my important tasks with excellence. Now, I prioritize and everything gets done.
How do you stay organized? What method(s) help you? I'll take all the help I can get :)
https://forum.aspose.com/t/cloneslide-get-wrong-slice-after-changing-order/100401 | code | I am evaluating your product Aspose.Slides for Java. I wrote a small piece of code which pulls a slide out of an existing PPT and stores it in a new PPT.
When I change the slide order in the source PPT, I get the wrong slide in my destination PPT.
To give an example:
Step 1: The original PPT looks like this:
-> The program clones the 2nd slide and, sure enough, I get slide 2. Works.
Step 2: I changed the order, and now the original PPT looks like this:
-> The program runs again; again I want the 2nd one and I get slide 2, but because I swapped its position with slide 3, I should now get slide 3.
I hope I have explained my problem clearly, and I am looking forward to your answer.
pres.getSlideByPosition(i) // positions numbered from 1
http://eolake.blogspot.com/2009/08/spotify-online-radio-service.html | code | Through The Lens pointed to Spotify, which is an online "radio" service which seems to work even better than Pandora. You can choose the music and its sequence, and unlike Pandora it does not seem to be limited to certain countries (last I checked). I wonder how they got around all those legal barriers? Or if they really have. For a while there I could use Pandora in the UK, but that was blocked later.
So far, I'm impressed. Right now I'm listening to a CD I'd heard about but never had a chance to hear, a 1994 collection of covers of songs by one of my favorite Danish bands from the seventies. I'd never expected to find a reasonably rare Danish item within a second of my first search.
Update: it seems that Spotify, refreshingly, was started in Europe or the UK, and is not yet available in North America. See comments for more info.
https://wiki.azotel.com/2020-1q-billing-list-pending-subscription-under-customer-details-page | code | 2020-1Q: BILLING: List Pending Subscription under Customer Details page
A new feature has been added to allow an operator to view "pending subscriptions".
This is useful for keeping track of a customer's subscription that has yet to begin when the future start date is unknown.
This information can be found under the "Customer Billing Details" section on the Customer page.
Fig 1 Subscription Details
Subscription Type can be modified in the Customer Subscription Table so that they are pending.
When the subscription needs to be activated the Subscription Type is changed to "Recurring", etc.
Fig 2 Subscription Type
Published Date: [23/Feb/2020]
Change History Log:
[enter revision date]
Contact Azotel Support:
Need more help? Save time by creating a maintenance ticket to Azotel through your instance. | s3://commoncrawl/crawl-data/CC-MAIN-2022-21/segments/1652662519037.11/warc/CC-MAIN-20220517162558-20220517192558-00178.warc.gz | CC-MAIN-2022-21 | 822 | 13 |
http://castechbytes.blogspot.com/2009/05/recovering-deleted-files-easily-even-if.html | code | I have deleted some files in the past month or so, and today realized that I needed them. So I had to find a way to recover files which I had also deleted from the "Trash" folder of my Ubuntu. I thought I had to download the big video files again, but I learned better.
Using PhotoRec, I was able to recover those files. And it was easy.
It uses a somewhat scary command line interface, but the how-to provided made it so easy, whatever operating system your computer may have.
PhotoRec works with Linux, Mac and Windows. So if you ever deleted something (whether using the graphical interface or command line), there is still hope.
If you need help on how to use this, you know where to find me.
Source: http://www.cgsecurity.org, accessed May 10, 2009. | s3://commoncrawl/crawl-data/CC-MAIN-2019-04/segments/1547583755653.69/warc/CC-MAIN-20190121025613-20190121051613-00283.warc.gz | CC-MAIN-2019-04 | 753 | 6 |
https://groups.google.com/g/bitblaze-users/c/odnnONJ48f8 | code | >>>>> "WW" == Wubing Wang <[email protected]
WW> I encountered the problem: Fatal error: exception Assert_failure
WW> ("...", 392, 7). When I ran: fuzzball
WW> -linux-syscalls /bin/cat -- cat /etc/hostname
WW> 390 if phr.ph_type = 1L then (* PT_LOAD *)
WW> 391 (if phr.ph_flags = 5L && extra_vaddr = 0L then
WW> 392 assert(phr.vaddr = load_base);
WW> 393 if data_too || (phr.ph_flags <> 6L && phr.ph_flags <> 7L) then
WW> 394 load_segment fm ic phr extra_vaddr true)
WW> Do you have any idea why this problem happen?
WW> My system is ubuntu 14.04 LTS, 64-bit
Despite your system being 64-bit, did you nonetheless arrange for
/bin/cat to be a 32-bit binary? Usually on a 64-bit system that binary
would be 64-bit, so that would be my first guess as to the problem.
We've been working on FuzzBALL's 64-bit x86 support recently, which
would allow it to get to this point in the loading process, but I
wouldn't expect the whole program to function yet. Also at the moment
you need to specify the instruction set architecture if it's not
32-bit x86, so you'd need to supply "-arch x64" to run a 64-bit
binary, but if you make only that change I think it will just get a
bit further before failing.
As it happens, someone else sent me a similar question by private
email today as well, but without the hint of the system being 64-bit.
In answering that I brainstormed some other possible causes, which I
can go into if it's not, but after thinking about your question I
think the binary being 64-bit is the most likely problem. (This is the
very first example in the README, so I'm guessing you haven't gotten
any other programs to work yet either.)
You can use the "file" program to distinguish 32-bit from 64-bit
binaries. Usually on a 64-bit Linux system you can still compile
32-bit binaries by giving the -m32 option to gcc, and Debian and
Ubuntu can install 32-bit libraries in parallel using the "multiarch"
feature, but to get a complete set of 32-bit binaries and libraries
the most foolproof approach is to install a whole 32-bit system,
either in a virtual machine or a "chroot" virtual filesystem.
Hope this helps, | s3://commoncrawl/crawl-data/CC-MAIN-2022-33/segments/1659882572304.13/warc/CC-MAIN-20220816120802-20220816150802-00738.warc.gz | CC-MAIN-2022-33 | 2,100 | 36 |
https://discord.com/blog/how-discord-indexes-billions-of-messages | code | Millions of users send billions of messages on Discord every month. A way to search this history quickly became one of the most requested features we built. Let there be search!
- Cost Effective: The core user experience on Discord is our text and voice chat. Search is an accessory feature, and the price of the infrastructure needed to reflect that. Ideally this means search shouldn’t cost more than the actual storage of messages.
- Fast & Intuitive: All the features we build need to be fast and intuitive, including search. The search experience in our product needed to look and feel awesome to use too.
- Self-healing: We don’t have a dedicated devops team (yet), so search needed to be able to tolerate failures with minimal to no operator intervention.
- Linearly Scalable: Just like how we store messages, increasing the capacity of our search infrastructure should involve adding more nodes.
- Lazily Indexed: Not everyone uses search — we shouldn’t index messages unless someone attempts to search them at least once. Additionally, if an index fails, we needed to be able to re-index servers on the fly.
In looking at these requirements, we asked ourselves two key questions:
Q. Could we outsource search to a managed SaaS? (easymode)
A. Nope. Every solution we looked at that did managed search would have blown our budget (by an astronomically high margin) for this feature. Additionally, the thought of shipping messages out of our datacenter did not sit well with the team. As a security conscious team, we wanted to be in control of the security of users’ messages, not trusting a third party to know what they are doing.
Q. Is there an open source solution to search that we can use?
A. Yeah! We looked around and the conversation internally quickly came to Elasticsearch vs Solr, as both could be an appropriate fit for our use case. Elasticsearch had the edge:
- Node discovery on Solr requires ZooKeeper. We run etcd, and did not want to have additional infrastructure specifically for Solr. Elasticsearch’s Zen Discovery is self contained.
- Elasticsearch supports automatic shard rebalancing, which would let us add new nodes to the cluster, fulfilling the linearly scalable requirement out of the box.
- Elasticsearch has a structured query DSL built-in, whereas you’d have to programmatically create a query string with Solr using a third party library.
- Engineers on the team had more experience working with Elasticsearch
Would Elasticsearch Work?
Elasticsearch seemed to have everything we wanted and our engineers had experience working with it in the past. It provided a way to replicate data across different nodes to tolerate the failure of a single node, scale up the cluster by adding more nodes, and could ingest messages to be indexed without breaking a sweat. Reading around, we heard some horror stories about managing large Elasticsearch clusters, and really none of our backend team had any experience with managing Elasticsearch clusters apart from our logging infrastructure.
We wanted to avoid these cumbersome, large clusters, so we came up with the idea to delegate sharding and routing to the application layer, allowing us to index messages into a pool of smaller Elasticsearch clusters. This meant that in the event of a cluster outage only Discord messages contained on the affected cluster would be unavailable for searching. This also gave us the advantage of being able to throw away an entire cluster’s data should it become unrecoverable (the system is able to lazily re-index the Discord server the next time a user performs a search).
Elasticsearch likes it when documents are indexed in bulk. This meant that we couldn’t index messages as they were being posted in real time. Instead, we designed a queue in which a worker grabs a bunch of messages and indexes them within in a single bulk operation. We decided that this small delay between when a message was posted and when it became searchable was a perfectly reasonable constraint. After all, most users search for messages said historically, not something just said.
On the ingest side, we needed a few things:
- Message Queue: We needed a queue that we can put messages into as they are posted in real time (to be consumed by a pool of workers).
- Index Workers: Workers that do the actual routing and bulk inserts into Elasticsearch from the queue.
We built a task queuing system on top of Celery already, so we leveraged it also for our historical index workers.
- Historical Index Workers: Workers responsible for iterating through the message history in a given server and inserting them into the Elasticsearch index.
We also needed a quick and easy mapping of which Elasticsearch cluster and index a Discord server’s messages would reside on. We call this “cluster + index” pair a Shard (not to be confused with Elasticsearch’s native shards within an index). The mapping we created comes in two layers:
- Persistent Shard Mapping: We put this on Cassandra, our primary data store for persistent data, as the source of truth.
- Shard Mapping Cache: When we’re ingesting messages on our workers, querying Cassandra for a Shard is a slow operation. We cache these mappings in Redis, so that we can do mget operations to quickly figure out where a message needs to be routed to.
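As an illustration of that two-layer lookup (a sketch, not Discord's actual code; key names and the Cassandra helper are assumptions):

    import redis

    r = redis.Redis()

    def get_shards(guild_ids):
        """Resolve each guild's Shard (cluster + index pair), hitting Redis first and Cassandra on a miss."""
        keys = ["shard_map:%d" % gid for gid in guild_ids]
        cached = r.mget(keys)
        shards = {}
        for gid, key, hit in zip(guild_ids, keys, cached):
            if hit is None:
                hit = lookup_shard_in_cassandra(gid)  # assumed helper backed by the source of truth
                r.set(key, hit)
            shards[gid] = hit
        return shards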
When a server is being indexed for the first time, we also needed a way to select which Shard to hold a Discord server’s messages on. Since our Shards are an application layered abstraction, we can be a bit smart about how to allocate them. By harnessing the power of Redis, we used a sorted set to build a load aware shard allocator.
- Shard Allocator: Using a sorted set in Redis we keep a set of the Shards with a score that represents their load. The Shard with the lowest score is the shard that should be allocated next. The score gets incremented with each new allocation, and each message that is indexed in Elasticsearch has a probability to increment the score of its Shard too. As Shards get more data in them they have a less likely chance of being allocated to a new Discord server.
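A sketch of that load-aware allocator using a Redis sorted set, reusing the client r from the previous sketch (names and the exact bump probability are assumptions):

    import random

    SHARD_LOADS = "shard_loads"  # sorted set: member = shard id, score = load

    def allocate_shard(guild_id):
        """Pick the least-loaded Shard for a server being indexed for the first time."""
        shard = r.zrange(SHARD_LOADS, 0, 0)[0]   # lowest score = least loaded
        r.zincrby(SHARD_LOADS, 1, shard)         # every allocation bumps the load
        persist_shard_mapping(guild_id, shard)   # assumed helper writing the mapping to Cassandra
        return shard

    def on_message_indexed(shard):
        # each indexed message has a small chance of bumping its shard's score too
        if random.random() < 0.01:
            r.zincrby(SHARD_LOADS, 1, shard)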
Of course, this entire search infrastructure would be incomplete without a way to discover clusters and the hosts within them from the application layer.
- etcd: We use etcd for service discovery in other parts of our system, so we also used it for our Elasticsearch clusters. Since nodes in a cluster can announce themselves onto etcd for the rest of the system to see, we don’t have to hardcode any Elasticsearch topologies.
Finally, we needed a way for clients to be able to actually search things.
- Search API: An API endpoint that clients can issue search queries to. It needed to do all the permission checks to make sure that clients are only searching messages they actually have access to.
Indexing & Mapping the Data
At a really high level, in Elasticsearch, we have the concept of an “index,” containing a number of “shards” within it. A shard in this case is actually a Lucene index. Elasticsearch is responsible for distributing the data within an index to a shard belonging to that index. If you want, you can control how the data is distributed amongst the shards by using a “routing key.” An index can also contain a “replication factor,” which is how many nodes an index (and its shards within) should be replicated to. If the node that the index is on fails a replica can take over (Unrelated but related, these replicas can also serve search queries, so you can scale the search throughput of the index by adding more replicas).
Since we handed all of the sharding logic in the application level (our Shards), having Elasticsearch do the sharding for us didn’t really make sense. However, we could use it to do replication and balancing of the indices between nodes in the cluster. In order to have Elasticsearch automatically create an index using the correct configuration, we used an index template, which contained the index configuration and data mapping. The index configuration was pretty simple:
- The index should only contain one shard (don’t do any sharding for us)
- The index should be replicated to one node (be able to tolerate the failure of the primary node the index is on)
- The index should only refresh once every 60 minutes (why we had to do this is explained below).
- The index contains a single document type: message
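Put together, a template body for those settings would look roughly like this (a sketch; the index pattern is an assumption):

    message_index_template = {
        "template": "messages-*",          # applied to every per-Shard message index
        "settings": {
            "number_of_shards": 1,         # no Elasticsearch-level sharding
            "number_of_replicas": 1,       # tolerate the loss of the primary node
            "refresh_interval": "3600s",   # only auto-refresh once an hour
        },
    }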
Storing the raw message data in Elasticsearch made little sense as the data was not in a format that was easily searchable. Instead, we decided to take each message, and transform it into a bunch of fields containing metadata about the message that we can index and search on:
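The exact field list isn't reproduced here, but a mapping in this spirit (field names are illustrative) conveys the idea of indexing searchable metadata while storing only the IDs:

    message_mapping = {
        "message": {
            "_source": {"enabled": False},                      # don't keep the raw document around
            "properties": {
                "id":         {"type": "long", "store": True},  # snowflake, so it doubles as a timestamp
                "guild_id":   {"type": "long", "store": True},
                "channel_id": {"type": "long", "store": True},
                "author_id":  {"type": "long"},
                "mentions":   {"type": "long"},
                "has":        {"type": "keyword"},              # e.g. "link", "embed", "attachment"
                "content":    {"type": "text"},                 # tokenized message text
            },
        }
    }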
You’ll notice that we didn’t include timestamp in these fields, and if you recall from our previous blog post, our IDs are Snowflakes, which means they inherently contain a timestamp (which we can use to power before, on, and after queries by using a minimum and maximum ID range).
These fields however aren’t actually “stored” in Elasticsearch, rather, they are only stored in the inverted index. The only fields that are actually stored and returned are the message, channel and server ID that the message was posted in. This means that message data is not duplicated in Elasticsearch. The tradeoff being that we’ll have to fetch the message from Cassandra when returning search results, which is perfectly okay, because we’d have to pull the message context (2 messages before & after) from Cassandra to power the UI anyway. Keeping the actual message object out of Elasticsearch means that we don’t have to pay for additional disk space to store it. However, this means we can’t use Elasticsearch to highlight matches in search results. We’d have to build the tokenizers and language analyzers into our client to do the highlighting (which was really easy to do).
Actually coding it.
We decided that a microservice for search was probably not required, and instead we exposed a library that wrapped our routing and querying logic to Elasticsearch. The only additional service we needed to run is the index workers (which would use this library to do the actual indexing work). The API surface area exposed to the rest of the team was also minimal, so that if it did need to be moved to its own service, it could easily be wrapped in an RPC layer. This library could be imported by our API workers as well to actually execute the search queries and return results to the user over HTTP.
To the rest of the team, the library exposed a minimal surface area for searching messages:
Queueing a message to be indexed or deleted:
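A stand-in sketch of what that queueing call could look like (function and key names are assumptions; reuses the Redis client r from above):

    import json

    def queue_message(op, message):
        """Queue an "index" or "delete" operation for the bulk workers to pick up."""
        r.rpush("search_index_queue", json.dumps({
            "op": op,
            "guild_id": message["guild_id"],
            "message": message,
        }))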
Bulk indexing (roughly) real time messages within a worker:
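And a sketch of that worker loop doing the grouped bulk insert (assumed helpers for host and index resolution; reuses r, json and get_shards from the earlier sketches):

    from collections import defaultdict
    from elasticsearch import Elasticsearch, helpers

    def index_batch(batch_size=1000):
        jobs = []
        for _ in range(batch_size):
            raw = r.lpop("search_index_queue")
            if raw is None:
                break
            jobs.append(json.loads(raw))
        if not jobs:
            return
        shards = get_shards({j["guild_id"] for j in jobs})
        by_shard = defaultdict(list)
        for j in jobs:
            by_shard[shards[j["guild_id"]]].append(j)
        for shard, shard_jobs in by_shard.items():
            es = Elasticsearch(hosts_for(shard))                 # assumed helper resolving hosts via etcd
            actions = [{"_index": index_for(shard),              # assumed helper naming the index
                        "_id": j["message"]["id"],
                        "_source": search_fields(j["message"])}  # assumed transform into the mapping above
                       for j in shard_jobs]
            helpers.bulk(es, actions)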
For indexing a server's historical messages, we built a historical index job which would perform a unit of work and return the next job that needed to run to continue indexing that server. Each job represents a cursor into a server's message history and a fixed unit of execution (in this case defaulting to 500 messages). The job returns a new cursor to the next batch of messages to be indexed or None if there is no more work to be done. In order to return results quickly for a large server, we split the historical indexing into two phases, an "initial" and "deep" phase. The "initial" phase indexes the last 7 days of messages on the server and makes the index available to the user. After that, we index the entire history in the "deep" phase, which executes at a lower priority. This article shows what it looks like to the user. These jobs are executed in a pool of celery workers, allowing for scheduling of the jobs amongst other tasks that the workers run. This roughly looks like:
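The unit-of-work loop can be pictured as a Celery task that indexes one batch and then schedules its own continuation (a sketch, not the actual library code; fetch_history and bulk_index are assumed helpers):

    from celery import Celery

    app = Celery("search_indexer")

    @app.task
    def index_history(guild_id, cursor=None, limit=500):
        """Index one unit of a server's history, then queue the next unit if there is more."""
        messages, next_cursor = fetch_history(guild_id, cursor, limit)
        bulk_index(guild_id, messages)
        if next_cursor is not None:
            index_history.delay(guild_id, cursor=next_cursor, limit=limit)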
Testing It Out On Production
After coding this up and testing it on our development environment, we decided it was time to see how it’d perform on production. We spun up a single Elasticsearch cluster with 3 nodes, provisioned the index workers, and scheduled 1,000 of the largest Discord servers to be indexed. Everything seemed to be working, however when looking at the metrics on the cluster, we noticed two things:
- CPU usage way higher than expected.
- Disk usage was growing way too fast for the volume of messages being indexed.
We were pretty confused, and after letting it run for a while and use up way too much disk space, we cancelled the index jobs and called it for the night. Something wasn’t quite right.
When we came back the following morning we noticed that disk usage had shrunk by A LOT. Did Elasticsearch throw away our data? We tried issuing a search query on one of the servers that we indexed that one of us was in. Nope! The results were being returned just fine — and fast too! What gives?
Disk Usage Growing Fast then Tapering Off
After doing some research, we came up with a hypothesis! By default, Elasticsearch has its index refresh interval set to 1 second. This is what provides the “near real-time” search ability in Elasticsearch. Every second (across a thousand indexes) Elasticsearch was flushing the in-memory buffer to a Lucene segment, and opening the segment to make it searchable. Over night, while idle, Elasticsearch merged the massive amounts of tiny segments it generated into much larger (but more space efficient) ones on disk.
Testing this out was pretty simple: We dropped all the indexes on the cluster, set the refresh interval to an arbitrarily large number, and we then scheduled the same servers to be indexed. CPU usage was down to almost nothing while the documents were being ingested, and disk usage was not growing at an alarmingly high rate. Huzzah!
Disk Usage After Decreasing the Refreshing Interval
Unfortunately, however, turning off the refresh interval doesn’t work in practice…
It had become apparent that Elasticsearch’s automatic near real-time index availability wouldn’t work for our needs. Chances are a server could go hours without needing to execute a single search query. We needed to build a way to control the refreshing from the application layer. We did this through an expiring hashmap in Redis. Given that servers on Discord are sharded into shared indexes on Elasticsearch, we can build a quick map that is updated alongside the index, tracking if an index needs to be refreshed — given the server you are searching in. The data structure was simple: the Redis key storing the hashmap was prefix + shard_key to a hashmap of guild_id to a sentinel value saying that it needed to be refreshed. In retrospect, this could have probably been a set.
The indexing lifecycle thus turned into:
- Take N messages from the queue.
- Figure out where these messages should be routed to by their guild_id.
- Execute bulk insert operations to the relevant clusters.
- Update the Redis mappings, signifying that the shard and the given guild_ids within the Shard that were updated are now dirty. Expire this key after 1 hour (as Elasticsearch would have auto-refreshed by then).
And the search lifecycle turned into:
- Look up the Shard that needs to be queried for the guild_id.
- Check the Redis mapping to see if the Shard AND guild_id is dirty.
- If dirty, do a refresh of the Shard’s Elasticsearch Index, and mark the entire Shard as clean.
- Execute the search query and return results.
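Tying the two lifecycles together, the bookkeeping around that dirty map might look roughly like this (key layout and helpers are assumptions; reuses r, get_shards, Elasticsearch, hosts_for and index_for from the sketches above):

    def mark_dirty(shard, guild_ids):
        # Called from the index worker after a bulk insert.
        key = "search_dirty:%s" % shard
        r.hset(key, mapping={gid: 1 for gid in guild_ids})
        r.expire(key, 3600)  # Elasticsearch auto-refreshes hourly anyway

    def search_messages(guild_id, query):
        shard = get_shards([guild_id])[guild_id]
        es = Elasticsearch(hosts_for(shard))
        key = "search_dirty:%s" % shard
        if r.hget(key, guild_id):
            es.indices.refresh(index=index_for(shard))  # make pending documents searchable
            r.delete(key)                               # the whole Shard is clean again
        return es.search(index=index_for(shard), body=query)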
You may have noticed that even though we now explicitly control the refreshing logic on Elasticsearch, we still have it auto-refresh the underlying index every hour. If data-loss occurs on our Redis mapping it’d take at most one hour for the system to correct itself automatically.
Since deploying in January, our Elasticsearch infrastructure has grown to 14 nodes across 2 clusters, using the n1-standard-8 instance type on GCP with 1TB of Provisioned SSD each. The total document volume is almost 26 billion. The rate of indexing peaked at approximately 30,000 messages per second. Elasticsearch has handled it without a sweat — remaining at 5–15% CPU throughout our roll-out of search.
So far, we’ve been able to add more nodes to the clusters with ease. At some point, we will spin up more clusters so that new Discord servers being indexed land on them (thanks to our weighted shard distribution system). On our existing clusters, we’ll need to limit the number of master eligible nodes as we add more data nodes to the cluster.
We’ve also stumbled upon 4 primary metrics that we use to decide when the cluster needs to be grown:
- heap_free: (aka heap_committed — heap_used) When we run out of free heap space, the JVM is forced to do a full stop-the-world GC to quickly reclaim space. If it fails to reclaim enough space, the node will crash and burn. Before then, the JVM will get into a state where it’s doing stop-the-world GCs constantly as the heap fills up and too little memory is freed during each full GC. We look at this along with GC stats to see how much time is spent garbage collecting.
- disk_free: Obviously when we run out of disk space, we’ll need to add more nodes, or more disk space to handle new documents being indexed. This is very easy on GCP as we can just grow the size of the disk without rebooting the instance. Choosing between adding a new node or resizing disks depends on how the other metrics mentioned here look. For example, if disk usage is high, but the other metrics are at acceptable levels, we’ll choose to add more disk space rather than a new node.
- cpu_usage: If we reach a threshold of CPU usage during peak hours.
- io_wait: If the IO operations on the cluster are getting too slow.
[Chart: Unhealthy cluster (ran out of heap): heap free (MiB), time spent in GC/s]
[Chart: heap free (GiB), time spent in GC/s]
It’s now a little over three months since we’ve launched the search feature, and so far this system has held up with little to no issues.
Elasticsearch has shown stable and consistent performance from 0 to 26 billion documents across around 16,000 indices and millions of Discord servers. We'll continue to scale by adding more clusters or more nodes to existing clusters. At some point, we may consider writing code that allows us to migrate indices between clusters as a way to shed load from a cluster or give a Discord server its own index if it's an exceptionally chatty server (though our weighted sharding system does a good job of making sure large Discord servers usually get their own shards currently).
We are hiring, so come join us if this type of stuff tickles your fancy. | s3://commoncrawl/crawl-data/CC-MAIN-2022-49/segments/1669446710933.89/warc/CC-MAIN-20221203143925-20221203173925-00159.warc.gz | CC-MAIN-2022-49 | 18,439 | 87 |
https://hoyelam.com/how-to-get-better-feedback/ | code | People who are good at their work always hunger for feedback.
They desire to improve their current skills or even learn new ones. Based on feedback or tips they can improve themselves even further. This increases their focus, productivity, and even enjoyment in working.
But if you ask for feedback you may get generic answers like:
‘I like what you are doing, so keep it up!’
‘Yeah, you can improve on skill X’. These answers are useless as feedback.
So how do you ask for feedback?
First of all, do not ask the other person very generic questions like: ‘Hey, do you have feedback or tips for me?’ or something similar.
This question is too broad. It is like asking me what I think of you and I will answer: you are good, nice blah blah blah. Basically, you will get useless information.
Instead, ask specific questions. For example, keep track of what you did in the past days to weeks. Other people’s memories are still fresh from that period. Thus they can recall more. If you had a meeting and want to know how you handled it, ask:
‘How did you think I did the presentation at that meeting?’
‘Do you think I was a bit too blunt in that meeting?’.
If I get these questions, I can answer more specifically and give, for example:
Yeah, the presentation was great, I missed some more information about X. Think that could be included as well.
No, you weren’t too blunt. In fact, it is great since we are not straying away from our goals. To improve you can provide one or two arguments extra for your points.
I’m 100% sure that this feedback is more useful than the generic ‘Yes, you are great’.
By asking the right questions, you get better answers and thus feedback.
Second, let your colleagues know that you want to improve on something specific. For example, you may want to improve your documentation skills. Tell that to your colleagues.
Your colleagues will get a better idea of what kind of feedback you want. If you told them and ask them a week later:
‘Hey, as you know I’m trying to improve my documentation skills. What did you think of document X and Y? ’.
They have more context of what kind of feedback you want. Then they are more prone to giving you the right feedback.
To get the right feedback, you need to be specific and open about it. Generic questions like ‘can you give me feedback?’ will result in generic answers, which are useless in most cases.
Ask what they thought about a specific piece of work or the social setting of you. Only then you get a higher chance of getting something useful.
Be specific and ask the right questions for the feedback! | s3://commoncrawl/crawl-data/CC-MAIN-2024-18/segments/1712296819971.86/warc/CC-MAIN-20240424205851-20240424235851-00450.warc.gz | CC-MAIN-2024-18 | 2,595 | 23 |
http://stackoverflow.com/questions/5797983/setting-value-for-publisher-size-and-version-in-win-7s-programs-and-features/5798042 | code | I have a Java application that is running on Windows 7. When I look at the uninstaller inside control panel>Programs and Features I see that other apps have values for Publisher, Size, and Version.
I would like to set these values in my application, but I do not know how.
Could any of you kind people please point me to an article or explain to me how I can accomplish this? I've done a bit of searching but I am not coming up with anything. | s3://commoncrawl/crawl-data/CC-MAIN-2014-10/segments/1394678677656/warc/CC-MAIN-20140313024437-00074-ip-10-183-142-35.ec2.internal.warc.gz | CC-MAIN-2014-10 | 442 | 3 |
https://theiapolis.com/actor-1Y9K/frank-moro/filmography/ | code | A non-exhaustive filmography for Frank Moro, an actor.
As an actor, Frank Moro worked on movies such as .
Please note that, at this time, we are not yet listing TV-series, or other performances.
If you want to report mistakes or omissions, please do not hesitate to post them in the comments below! Thank you!
https://science-math.wright.edu/mathematics-and-statistics/departmental-advising-and-resources | code | Students are advised in the college of their admitted or intended major.
Find Your Primary Advisor
- Log in to WINGS Express.
- Select Registration and Records.
- Click View Student Information. The name of your advisor will be listed under Student Information.
Course Planning Schedules
IMPORTANT: The schedules below are tentative. Specific courses offered, dates, and times are subject to change depending on instructor and classroom availability, student interest, and university requirements.
Lower-Level Undergraduate Courses
The department offers the following courses every fall and spring semester:
- MTH 1270: Introduction to Functions and Modeling
- MTH 1280 College Algebra
- MTH 1350 Analytic Geometry and Trigonometry
- MTH 1450 Mathematics and the Modern World
- MTH 2240 Applied Calculus
- MTH 2280 Business Calculus
- MTH 2300 Calculus I
- MTH 2310 Calculus II
- MTH 2320 Calculus III
- MTH 2350 Differential Equations with Matrix Algebra
- MTH 2415: Elementary Mathematics Concepts for Educators I
- MTH 2435: Elementary Mathematics Concepts for Educators II
- MTH 2570 Discrete Mathematics for Computing
- STT 1600 Statistical Concepts
- STT 2640 Elementary Statistics | s3://commoncrawl/crawl-data/CC-MAIN-2023-40/segments/1695233510983.45/warc/CC-MAIN-20231002064957-20231002094957-00173.warc.gz | CC-MAIN-2023-40 | 1,188 | 24 |
https://duncanlock.net/tag/fat32.html | code | Ever tried to copy something onto a USB flash drive, only to discover that the file was too big to copy?
This is because most USB Flash drives are formatted using the FAT32 filesystem - which only supports individual files up to 4 GB in size, no matter how much free space you’ve got. It also only supports drives up to 2 TB, can’t store symbolic links, can’t store files with these characters in the name:
"*/:<>?\| – and is generally pretty crappy.
The solution is to use a better filesystem
There are many filesystems to choose from - but there are a few criteria that a good USB flash drive filesystem needs to meet which cut down the choices quite a bit:
- Compatible with every computer - just plug it in, and it should work
- Has no permissions, or optional … | s3://commoncrawl/crawl-data/CC-MAIN-2024-10/segments/1707947473824.45/warc/CC-MAIN-20240222193722-20240222223722-00676.warc.gz | CC-MAIN-2024-10 | 776 | 7 |
https://jsonapi-resources.com/v0.10/guide/resource_caching.html | code | To improve the response time of GET requests, JR can cache the generated JSON fragments for some or all of your Resources, using a key-based cache expiration system.
To begin, set config.resource_cache to an ActiveSupport cache store:

    JSONAPI.configure do |config|
      config.resource_cache = Rails.cache
    end
Then, on each Resource you want to cache, call the caching class method:

    class PostResource < JSONAPI::Resource
      caching
      # attributes, relationships, etc.
    end
The Post model in this example must have the Rails timestamp field updated_at.
See the “Resources to Not Cache” section below for situations where you might not want to enable caching on particular Resources.
Also, when caching is enabled, please be careful about direct database manipulation. If you alter a database row without changing the
updated_at field, the cached entry for that resource will be inaccurate.
Instead of the default updated_at, you can use a different field (and take on responsibility for updating it). In that case the model has to keep the field up to date itself, for example:

    class Post < ActiveRecord::Base
      # Illustrative only: bump a custom change counter on every save and
      # point the resource's caching at this column instead of updated_at.
      before_save { self.change_counter += 1 }
    end
One reason to do this is that
updated_at provides a narrow race condition window. If a resource is updated twice in the same second, it’s possible that only the first update will be cached. If you’re concerned about this, you will need to find a way to make sure your models’ cache fields change on every update, e.g. by using a unique random value or a monotonic clock.
JR does not actively clean the cache, so you must use an ActiveSupport cache that automatically expires old entries, or you will leak resources. The default behavior of Rails’ MemoryCache is good, but most other caches will have to be configured with an
:expires_in option and/or a cache-specific clearing mechanism. In a Redis configuration for example, you will need to set
maxmemory to a reasonably high size, and set maxmemory-policy to an eviction policy such as allkeys-lru so that old entries are evicted automatically.
Also, sometimes you may want to manually clear the cache. If you make a code change that affects serialized representations (i.e. changing the way an attribute is shown), or if you think that there might be invalid cache entries, you can clear the cache by running
JSONAPI.configuration.resource_cache.clear from the console.
You do not have to manually clear the cache after merely adding or removing attributes on your Resource, because the field list is part of the cache key.
Even after configuring JSONAPI.configuration.resource_cache, you may still choose to leave some Resources uncached for various reasons:
- If your Resource is not based on ActiveRecord, e.g. it uses PORO objects or singleton resources or a different ORM/ODM backend. Caching relies on ActiveRecord features.
- If a Resource’s attributes depend on many things outside that Resource, e.g. flattened relationships, but it would be too cumbersome to have all those touch the parent resource on every change.
- If the content of attributes is affected by context in a way that is too difficult to handle with
attribute_caching_context, as described below.
If context affects the output of any method providing the actual content of an attribute, or the
fetchable_fields methods, then you must provide a class method on your Resource named
attribute_caching_context. This method should return a subset of the context that is (a) serializable and (b) uniquely identifies the caching situation:
    class PostResource < JSONAPI::Resource
      caching

      def self.attribute_caching_context(context)
        # Return only the parts of the context that affect attribute output
        # or fetchable_fields (the keys below are illustrative).
        {
          locale: context[:locale],
          admin: context[:admin]
        }
      end
    end
This is necessary because cache lookups do not create instances of your Resource, and therefore cannot call instance methods like
fetchable_fields. Instead, they have to rely on finding the correct cached representation of the resource, the one generated when these methods were called with the correct context the first time. The attribute caching context is a way to let JR “sub-categorize” your cache by the various parts of your context that affect these instance methods. The same mechanism is also used internally by JR when clients request sparse fieldsets; a cached sparse representation and the cached representation with all attributes don’t collide with each other.
This becomes trickier if you depend on the state of the model, not just on the state of the context. For example, suppose you have a
UserResource#fetchable_fields that excludes certain attributes based on the state of the user record itself, not just on the context. A caching context cannot capture that model-dependent state, so such a Resource is usually better left uncached.
If you write a custom
ResourceSerializer which takes new options, then you must define
config_description to include those options if they might impact the serialized value:
    class MySerializer < JSONAPI::ResourceSerializer
      def initialize(primary_resource_klass, options = {})
        @my_option = options[:my_option]  # illustrative custom option
        super
      end

      def config_description(resource_klass)
        # Fold the custom option into the cache configuration description
        # so it becomes part of the cache key.
        super.merge(my_option: @my_option)
      end
    end
https://www.mindarie.wa.edu.au/courses/year-11/mathematics-applications-atar | code | A CASIO CAS calculator is required for this course.
- Course Code - AEMAA
- University Pathway
Estimated Cost: $50.00
Based on previous years pricing and subject to change.
This course focuses on the use of mathematics to solve problems in contexts that involve financial modelling, geometric and trigonometric analysis, graphical and network analysis, and growth and decay in sequences.
It also provides opportunities for students to develop systematic strategies based on the statistical investigation process for answering questions that involve analysing univariate and bivariate data, including time series data.
The Mathematics Applications ATAR course aims to develop students’:
- understanding of concepts and techniques drawn from, and ability to solve applied problems from, the topic areas of number and algebra, geometry and trigonometry, graphs and networks, and statistics
- reasoning and interpretive skills in mathematical and statistical contexts
- capacity to communicate the results of a mathematical or statistical problem-solving activity in a concise and systematic manner using appropriate mathematical and statistical language
- capacity to choose and use technology appropriately and efficiently.
Minimum Entrance Requirements
C grade in Year 10 Mathematics
Year 12 - Mathematics Applications ATAR
Course Code ATMAA
RETURN TO COURSES | s3://commoncrawl/crawl-data/CC-MAIN-2023-23/segments/1685224643585.23/warc/CC-MAIN-20230528051321-20230528081321-00634.warc.gz | CC-MAIN-2023-23 | 1,360 | 17 |
https://pythonawesome.com/abandoned-plan-for-a-clone-of-the-old-flash-game-star-relic/ | code | When I was in middle school, I was a fan of the Flash game Star Relic (no longer playable in modern browsers, but it works alright in Flashpoint if you fiddle around with the graphics settings a bit.) The idea was that you had a bunch of spaceships on a hexagonal grid, and you could move them around and have them shoot at things in a turn-based combat system. I liked that idea quite a bit and decided to give cloning it a go. Unfortunately I was nowhere near advanced enough to do things like render real graphics, but I did at least get a square grid rendered where you could point and click using the ~same interface as Star Relic to move things around and shoot. That's what you'll find in this repository. | s3://commoncrawl/crawl-data/CC-MAIN-2024-10/segments/1707948217723.97/warc/CC-MAIN-20240305024700-20240305054700-00387.warc.gz | CC-MAIN-2024-10 | 712 | 1 |
https://www.reddit.com/r/starcitizen/comments/7pziwy/um_i_have_a_retaliator/ | code | So, I've been suffering from the "REC weapons after 3.0" bug, so I've been trying various things in order to try to work around it.
I just got access to hangars again. I'm not sure why, but I suspect it's because I bought a one-month subscription so I got those hangars. I tried to use Noobifier's workaround (load out your ship in the hangar, then go straight into AC and the loadout will persist) but it wasn't working. So, I changed the loadout, QUIT GAME (instead of exit to menu), then restarted. I jumped directly into AC Free Flight, and I was in an Aegis Retaliator.... Um, really cool but how the hell did THAT happen? :) | s3://commoncrawl/crawl-data/CC-MAIN-2018-47/segments/1542039742978.60/warc/CC-MAIN-20181116045735-20181116071735-00317.warc.gz | CC-MAIN-2018-47 | 630 | 2 |
https://www.oreilly.com/library/view/the-credit-default/9781576602362/ | code | The growth of the credit derviatives market has produced a liquid market in credit default swaps across the credit curve, and this liquidity has led many investors to access both the credit derivative and cash bond markets to meet their investment requirements.
This book investigates the close relationship between the synthetic and cash markets in credit, which manifests itself in the credit default swap basis. Choudhry covers the factors that drive the basis, implications for market participants, the CDS index basis, and trading the basis.
Credit market investors and traders as well as anyone with an interest in the global debt markets will find this insightful and rewarding.
THIS BOOK QUALIFIES FOR 7 PD CREDITS UNDER THE GUIDELINES OF THE CFA INSTITUTE PROFESSIONAL DEVELOPMENT PROGRAM.
Table of contents
- Cover Page
- Title Page
- Expanded Contents
- ABOUT THE AUTHOR
- CHAPTER 1: A Primer on Credit Default Swaps
- CHAPTFR 2: Bond Spreads and Relative Value
- CHAPTER 3: The CDS Basis I: The Relationship Between Cash and Synthetic Credit Markets
- CHAPTER 4: Supply and Demand and the Credit Default Swap Basis
- CHAPTER 5: The CDS Basis II: Further Analysis of the Cash and Synthetic Credit Market Differential
- CHAPTER 6: Trading the CDS Basis: Illustrating Positive and Negative Basis Arbitrage Trades
- ABOUT BLOOMBERG
- A FRESH VIEW ON FIXED INCOME
- Title: The Credit Default Swap Basis
- Release date: October 2006
- Publisher(s): Bloomberg Press
- ISBN: 9781576602362
http://www.renaissanceflyrods.com/blog/oregon-bound-grass-rods | code | First is a Blonde 2 piece, 8', 6wt. with 2 tips. That one is for dad. He likes the traditional 2 piece rods, so that's what he got, along with a spare tip. It is a deep green with black trim wraps. I used a nice red agate stripping guide and chrome hardware for the traditional look Bill likes.
Second is for son. His is a 3 piece, 8', 6wt. lightly flamed orange with black trim wraps. Joe requested a longer handle due to his big old meat hooks. Joe's rod has a wrap matching agate stripper with black nickel hardware.
Until next time my friends, | s3://commoncrawl/crawl-data/CC-MAIN-2020-40/segments/1600400220495.39/warc/CC-MAIN-20200924194925-20200924224925-00693.warc.gz | CC-MAIN-2020-40 | 547 | 3 |
http://askubuntu.com/questions/227743/problems-installing-from-disk | code | I have 267GB of unallocated space that I want to install onto. (Installing eOS, but the installer is the same). I was going to use the option for a custom install, but didn't know where to put the boot loader. I use Windows7. I also wanted to know if I should create a new partition from Windows. Or if I can do that from the installer. Thanks!
Since you have said that the installer is the same, I am just giving instructions based on my experience from installing Ubuntu.
Under the section
You can create the new partition from the installer itself. | s3://commoncrawl/crawl-data/CC-MAIN-2016-26/segments/1466783397696.49/warc/CC-MAIN-20160624154957-00082-ip-10-164-35-72.ec2.internal.warc.gz | CC-MAIN-2016-26 | 551 | 4 |
https://thwack.solarwinds.com/thread/19602 | code | Currently running with V9 at the moment and wondered if it was going to be possible to add a drop down that lists all the SNMP credentials that are used. This is available in the Systems Manager console at the moment but does not seem available from the web front end.
Also is it possible to default the system to always select the CPU & Memory rather than having to select it each time.
I've added it to our database. We'll consider it. | s3://commoncrawl/crawl-data/CC-MAIN-2018-39/segments/1537267160568.87/warc/CC-MAIN-20180924145620-20180924170020-00197.warc.gz | CC-MAIN-2018-39 | 437 | 3 |
https://livingyourgreatness.libsyn.com/website/be-the-captain-of-your-ship-build-financial-confidence-and-have-a-war-chest-featuring-jay-martin | code | Apr 29, 2021
Cambridge House International Inc., President & CEO, Jay Martin, talks about the importance of investing, developing healthy habits, and building financial wealth. Jay is a people-driven investor, where he emphasizes "the captain of the ship over anything else" when looking at investment opportunities. Cambridge House produces the largest investment conferences in both technology and natural resources in Canada.
Podcast Host: Ben Mumme
Subscribe to the YouTube Channel: https://bit.ly/3fAcFrt
Connect with Ben on Twitter: https://twitter.com/mumme_ben
Join the #LivingYourGreatnesss community: https://www.instagram.com/livingyourgreatness/
Connect with Jay Martin on Twitter (https://twitter.com/JayMartinBC), Instagram (https://www.instagram.com/jaymartinbc/), check out his website (https://cambridgehouse.com/u/2207/jay-martin) and subscribe to his YouTube channel (https://www.youtube.com/results?search_query=jay+martin+cambridge+house). | s3://commoncrawl/crawl-data/CC-MAIN-2022-33/segments/1659882570868.47/warc/CC-MAIN-20220808152744-20220808182744-00225.warc.gz | CC-MAIN-2022-33 | 960 | 7 |
https://hub.vilarejo.pro.br/channel/gadgeteer/?cat=cli | code | BASH (Bourne Again SHell) is the default shell in practically all Linux-based operating systems. All the commands we write in the terminal are interpreted by the shell, and become part of its history. In this tutorial, we see where the shell history is saved, and how to manage it using the “history” built-in command and some environment variables.
The tutorial covers where and how the history is saved, how to modify history behaviour via environment variables, and so on. So it is not the usual "how to use history" guide, but it is worth mentioning that if you want to repeat a command by its number, you can quickly use !13 to rerun the line numbered 13.
See How to manage Bash history
https://techstalking.com/programming/python/typeerror-list-object-cannot-be-interpreted-as-an-integer/ | code | Each Answer to this Q is separated by one/two green lines.
The playSound function takes a list of integers and is going to play a sound for every different number. So if one of the numbers in the list is 1, it has a designated sound that it will play.
def userNum(iterations):
    myList = []
    for i in range(iterations):
        a = int(input("Enter a number for sound: "))
        myList.append(a)
        return myList
    print(myList)

def playSound(myList):
    for i in range(myList):
        if i == 1:
            winsound.PlaySound("SystemExit", winsound.SND_ALIAS)
I am getting this error:
TypeError: 'list' object cannot be interpreted as an integer
I have tried a few ways to convert the list to integers. I am not too sure what I need to change. I am sure that there is a more efficient way of doing this. Any help would be very greatly appreciated.
Error messages usually mean precisely what they say. So they must be read very carefully. When you do that, you’ll see that this one is not actually complaining, as you seem to have assumed, about what sort of object your list contains, but rather about what sort of object it is. It’s not saying it wants your list to contain integers (plural)—instead, it seems to want your list to be an integer (singular) rather than a list of anything. And since you can’t convert a list into a single integer (at least, not in a way that is meaningful in this context) you shouldn’t be trying.
So the question is: why does the interpreter seem to want to interpret your list as an integer? The answer is that you are passing your list as the input argument to range, which expects an integer. Don't do that. Say for i in myList instead.
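Putting that advice together, a minimal corrected version of the question's two functions might look like this (a sketch that assumes the same Windows-only winsound call from the question; everything else is unchanged):

import winsound  # Windows-only module, as used in the question

def userNum(iterations):
    myList = []
    for i in range(iterations):
        a = int(input("Enter a number for sound: "))
        myList.append(a)
    return myList  # return after the loop so all numbers are collected

def playSound(myList):
    for item in myList:  # iterate over the list itself, not range(myList)
        if item == 1:
            winsound.PlaySound("SystemExit", winsound.SND_ALIAS)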
For me, I was getting this error because I needed to put the arrays in parentheses. The error is a bit tricky in this case…

concatenate((a, b)) is right.

Hope that helps.
The error is from this:
def playSound(myList):
    for i in range(myList):  # <= myList is a list, not an integer
You cannot pass a list to range, which expects an integer. Most likely, you meant to do:
def playSound(myList):
    for list_item in myList:

or

def playSound(myList):
    for i in range(len(myList)):

or

def playSound(myList):
    for i, list_item in enumerate(myList):
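To make the difference between those alternatives concrete, here is a small hypothetical demo (the sounds list is made up for illustration):

sounds = [1, 3, 1]

for item in sounds:                  # values directly
    print(item)                      # 1, 3, 1

for i in range(len(sounds)):         # indices
    print(i, sounds[i])              # 0 1, 1 3, 2 1

for i, item in enumerate(sounds):    # indices and values together
    print(i, item)                   # 0 1, 1 3, 2 1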
range is expecting an integer argument, from which it will build a range of integers:
>>> range(10)
range(0, 10)
>>> list(range(10))
[0, 1, 2, 3, 4, 5, 6, 7, 8, 9]
>>>
Moreover, giving it a list will raise a TypeError, since range will not know how to handle it:
>>> range([1, 2, 3])
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
TypeError: 'list' object cannot be interpreted as an integer
>>>
If you want to access the items in myList, loop over the list directly:
for i in myList: ...
>>> myList = [1, 2, 3]
>>> for i in myList:
...     print(i)
...
1
2
3
>>>
Use for i in myList. range takes in an integer; you want to loop over each element in the list.
You should do this instead:
for i in myList: # etc.
That is, remove the range() part. The range() function is used to generate a sequence of numbers, and it receives as parameters the limits of the range to generate; it won't work to pass it a list. For iterating over the list, just write the loop as shown above.
since it’s a list it cannot be taken directly into range function as the singular integer value of the list is missing.
for i in range(len(myList)):
with this, we get the singular integer value which can be used easily
In playSound(), instead of

for i in range(myList):

use

for i in myList:

This will iterate over the contents of myList, which I believe is what you want. range(myList) doesn't make any sense.
def userNum(iterations):
    myList = []
    for i in range(iterations):
        a = int(input("Enter a number for sound: "))
        myList.append(a)
    print(myList)  # print before return
    return myList  # return outside of loop

def playSound(myList):
    for i in range(len(myList)):  # range takes int not list
        if i == 1:
            winsound.PlaySound("SystemExit", winsound.SND_ALIAS)
n = input().split()
ar = []
for i in n:
    if i not in ar:
        ar.append(i)
print(*ar)
We usually pass a string where an integer is expected… for that we have to use n.
When we face an error like this: 'list' object cannot be interpreted as an integer, it is usually because we used X instead of len(X) in a for loop:
# error
for i in range(df.index):
    pass

# correct
for i in range(len(df.index)):
    pass