Shashwat13333 committed
Commit 75cb577 · verified · 1 Parent(s): c774b8e

Model save
1_Pooling/config.json ADDED
@@ -0,0 +1,10 @@
+ {
+   "word_embedding_dimension": 768,
+   "pooling_mode_cls_token": true,
+   "pooling_mode_mean_tokens": false,
+   "pooling_mode_max_tokens": false,
+   "pooling_mode_mean_sqrt_len_tokens": false,
+   "pooling_mode_weightedmean_tokens": false,
+   "pooling_mode_lasttoken": false,
+   "include_prompt": true
+ }
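This pooling configuration keeps only the `[CLS]` token embedding: every mean/max/weighted-mean/last-token mode is disabled. As a minimal sketch (the library builds this module from the config file automatically; constructing it by hand is shown only for illustration), the equivalent `sentence_transformers.models.Pooling` module would look like:

```python
from sentence_transformers import models

# CLS-token pooling over 768-dimensional token embeddings,
# mirroring 1_Pooling/config.json above.
pooling = models.Pooling(
    word_embedding_dimension=768,
    pooling_mode_cls_token=True,
    pooling_mode_mean_tokens=False,
    pooling_mode_max_tokens=False,
)
```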
README.md ADDED
@@ -0,0 +1,900 @@
1
+ ---
2
+ language:
3
+ - en
4
+ license: apache-2.0
5
+ tags:
6
+ - sentence-transformers
7
+ - sentence-similarity
8
+ - feature-extraction
9
+ - generated_from_trainer
10
+ - dataset_size:150
11
+ - loss:MatryoshkaLoss
12
+ - loss:MultipleNegativesRankingLoss
13
+ base_model: BAAI/bge-base-en-v1.5
14
+ widget:
15
+ - source_sentence: Do you provide support 24/7?
16
+ sentences:
17
+ - 'How can we get started with your DevOps solutions?
18
+
19
+ Getting started is easy. Contact us through our website. We''ll schedule a consultation
20
+ to discuss your needs, evaluate your current infrastructure, and propose a customized
21
+ DevOps solution designed to achieve your goals.'
22
+ - 'This is our Portfolio
23
+
24
+ Introducing the world of Housing Finance & Banking Firm.
25
+
26
+ Corporate Website with 10 regional languages in India with analytics and user
27
+ personalization and Dashboard for Regional Managers, Sales Agents, etc. to manage
28
+ the Builder Requests, approve/deny Properties, manage visits and appointments,
29
+ manage leads, etc.
30
+
31
+
32
+
33
+ Introducing the world of Global Automotive Brand.We have implemented a Multi Locale
34
+ Multilingual Omnichannel platform for Royal Enfield. The platform supports public
35
+ websites, customer portals, internal portals, business applications for over 35+
36
+ different locations all over the world.
37
+
38
+
39
+ Developed Digital Platform for Students, Guardians, Teachers, Tutors, with AI/ML
40
+ in collaboration with Successive Technologies Inc, USA. Cloud, Dev-Sec-Ops &
41
+ Data Governance
42
+
43
+ Managing cloud provisioning and modernization alongside automated infrastructure,
44
+ event-driven microservices, containerization, DevOps, cybersecurity, and 24x7
45
+ monitoring support ensures efficient, secure, and responsive IT operations.'
46
+ - 'We are a New breed of innovative digital transformation agency, redefining storytelling
47
+ for an always-on world.
48
+
49
+ With roots dating back to 2017, we started as a pocket size team of enthusiasts
50
+ with a goal of helping traditional businesses transform and create dynamic, digital
51
+ cultures through disruptive strategies and agile deployment of innovative solutions.'
52
+ - source_sentence: What services do you offer for AI adoption?
53
+ sentences:
54
+ - 'In what ways can machine learning optimize our operations?
55
+
56
+ Machine learning algorithms can analyze operational data to identify inefficiencies,
57
+ predict maintenance needs, optimize supply chains, and automate repetitive tasks,
58
+ significantly improving operational efficiency and reducing costs.'
59
+ - "At Techchefz Digital, we specialize in guiding companies through the complexities\
60
+ \ of adopting and integrating Artificial Intelligence and Machine Learning technologies.\
61
+ \ Our consultancy services are designed to enhance your operational efficiency\
62
+ \ and decision-making capabilities across all sectors. With a global network of\
63
+ \ AI/ML experts and a commitment to excellence, we are your partners in transforming\
64
+ \ innovative possibilities into real-world achievements. \
65
+ \ \
66
+ \ \n DATA INTELLIGENCE PLATFORMS we\
67
+ \ specialize in\nTensorFlow\nDatabricks\nTableau\nPytorch\nOpenAI\nPinecone\""
68
+ - "SERVICES WE PROVIDE\nFlexible engagement models tailored to your needs\nWe specialize\
69
+ \ in comprehensive website audits that provide valuable insights and recommendations\
70
+ \ to enhance your online presence.\nDigital Strategy & Consulting\nCreating digital\
71
+ \ roadmap that transform your digital enterprise and produce a return on investment,\
72
+ \ basis our discovery framework, brainstorming sessions & current state analysis.\n\
73
+ \nPlatform Selection\nHelping you select the optimal digital experience, commerce,\
74
+ \ cloud and marketing platform for your enterprise.\n\nPlatform Builds\nDeploying\
75
+ \ next-gen scalable and agile enterprise digital platforms, along with multi-platform\
76
+ \ integrations. \nProduct Builds\nHelp you ideate, strategize, and engineer\
77
+ \ your product with help of our enterprise frameworks\nInfrastructure\nSpecialize\
78
+ \ in multi-cloud infrastructure helping you put forward the right cloud infrastructure\
79
+ \ and optimization strategy.\n\nManaged Services\nOperate and monitor your business-critical\
80
+ \ applications, data, and IT workloads, along with Application maintenance and\
81
+ \ operations.\nTeam Augmentation\nHelp you scale up and augment your existing\
82
+ \ team to solve your hiring challenges with our easy to deploy staff augmentation\
83
+ \ offerings.\""
84
+ - source_sentence: What challenges did the company face in its early days?
85
+ sentences:
86
+ - 'How do we do Custom Development ?
87
+
88
+ We follow below process to develop custom web or mobile Application on Agile Methodology,
89
+ breaking requirements in pieces and developing and shipping them with considering
90
+ utmost quality:
91
+
92
+ Requirements Analysis
93
+
94
+ We begin by understanding the client's needs and objectives for the website.
95
+ Identify key features, functionality, and any specific design preferences.
96
+
97
+
98
+ Project Planning
99
+
100
+ Then create a detailed project plan outlining the scope, timeline, and milestones.
101
+ Define the technology stack and development tools suitable for the project.
102
+
103
+
104
+ User Experience Design
105
+
106
+ Then comes the stage of Developing wireframes or prototypes to visualize the website's
107
+ structure and layout. We create a custom design that aligns with the brand identity
108
+ and user experience goals.
109
+
110
+
111
+ Development
112
+
113
+ After getting Sign-off on Design from Client, we break the requirements into Sprints
114
+ on Agile Methodology, and start developing them.'
115
+ - 'After a transformative scuba dive in the Maldives, Mayank Maggon made a pivotal
116
+ decision to depart from the corporate ladder in December 2016. Fueled by a clear
117
+ vision to revolutionize the digital landscape, Mayank set out to leverage the
118
+ best technology ingredients, crafting custom applications and digital ecosystems
119
+ tailored to clients'' specific needs, limitations, and budgets.
120
+
121
+
122
+ However, this solo journey was not without its challenges. Mayank had to initiate
123
+ the revenue engine by offering corporate trainings and conducting online batches
124
+ for tech training across the USA. He also undertook small projects and subcontracted
125
+ modules of larger projects for clients in the US, UK, and India. It was only after
126
+ this initial groundwork that Mayank was able to hire a group of interns, whom
127
+ he meticulously trained and groomed to prepare them for handling Enterprise Level
128
+ Applications. This journey reflects Mayank''s resilience, determination, and entrepreneurial
129
+ spirit in building TechChefz Digital from the ground up.
130
+
131
+
132
+ With a passion for innovation and a relentless drive for excellence, Mayank has
133
+ steered TechChefz Digital through strategic partnerships, groundbreaking projects,
134
+ and exponential growth. His leadership has been instrumental in shaping TechChefz
135
+ Digital into a leading force in the digital transformation arena, inspiring a
136
+ culture of innovation and excellence that continues to propel the company forward.'
137
+ - 'Our Solutions
138
+
139
+ Strategy & Digital Transformation
140
+
141
+ Innovate via digital transformation, modernize tech, craft product strategies,
142
+ enhance customer experiences, optimize data analytics, transition to cloud for
143
+ growth and efficiency
144
+
145
+
146
+ Product Engineering & Custom Development
147
+
148
+ Providing product development, enterprise web and mobile development, microservices
149
+ integrations, quality engineering, and application support services to drive innovation
150
+ and enhance operational efficiency.'
151
+ - source_sentence: What kind of data do you leverage for AI solutions?
152
+ sentences:
153
+ - 'In the Introducing the world of Global Insurance Firm, we crafted Effective Solutions
154
+ for Complex Problems and delivered a comprehensive Website Development, Production
155
+ Support & Managed Services, we optimized customer journeys, integrate analytics,
156
+ CRM, ERP, and third-party applications, and implement cutting-edge technologies
157
+ for enhanced performance and efficiency
158
+
159
+ and achieved 200% Reduction in operational time & effort managing content & experience,
160
+ 70% Reduction in Deployment Errors and Downtime, 2.5X Customer Engagement, Conversion
161
+ & Retention'
162
+ - 'Why do we need Microservices ?
163
+
164
+ Instead of building a monolithic application where all functionalities are tightly
165
+ integrated, microservices break down the system into modular and loosely coupled
166
+ services.
167
+
168
+
169
+ Scalability
170
+
171
+ Flexibility and Agility
172
+
173
+ Resilience and Fault Isolation
174
+
175
+ Technology Diversity
176
+
177
+ Continuous Delivery'
178
+ - Our AI/ML services pave the way for transformative change across industries, embodying
179
+ a client-focused approach that integrates seamlessly with human-centric innovation.
180
+ Our collaborative teams are dedicated to fostering growth, leveraging data, and
181
+ harnessing the predictive power of artificial intelligence to forge the next wave
182
+ of software excellence. We don't just deliver AI; we deliver the future.
183
+ - source_sentence: What do you guys do for digital strategy?
184
+ sentences:
185
+ - " What we do\n\nDigital Strategy\nCreating digital frameworks that transform\
186
+ \ your digital enterprise and produce a return on investment.\n\nPlatform Selection\n\
187
+ Helping you select the optimal digital experience, commerce, cloud and marketing\
188
+ \ platform for your enterprise.\n\nPlatform Builds\nDeploying next-gen scalable\
189
+ \ and agile enterprise digital platforms, along with multi-platform integrations.\n\
190
+ \nProduct Builds\nHelp you ideate, strategize, and engineer your product with\
191
+ \ help of our enterprise frameworks \n\nTeam Augmentation\nHelp you scale up and\
192
+ \ augment your existing team to solve your hiring challenges with our easy to\
193
+ \ deploy staff augmentation offerings .\nManaged Services\nOperate and monitor\
194
+ \ your business-critical applications, data, and IT workloads, along with Application\
195
+ \ maintenance and operations\n"
196
+ - "Introducing the world of\nGlobal Hospitality Firm\n\nIn this project, We focused\
197
+ \ on strategizing CX, diverse platform dev, travel booking, indemnity journeys,\
198
+ \ digital community, and managed services enhance travel experience and operational\
199
+ \ efficiency. \nStrategizing & defining the Customer Experience across business\
200
+ \ units and respective products / services,\nPlatform Development and Integrations\
201
+ \ across different tech stacks - Drupal, Magento, MERN, Microservices, Canvas\
202
+ \ LMS, OKTA SSO, AWS based Cloud Infrastructure, Build Automation\nTravel Packages\
203
+ \ Booking Platform with payments, subscriptions, real time booking, etc\nIndemnity\
204
+ \ & Self-Service Journeys\n\nAnd we achieved, 100% Improvement in Marketing Content,\
205
+ \ Real Time Prices & Inventories delivery. 80% Increase in Customer Retention,175%\
206
+ \ Increase in Partner & Vendor Operational Efficiency"
207
+ - 'Introducing the world of General Insurance Firm
208
+
209
+ In this project, we implemented Digital Solution and Implementation with Headless
210
+ Drupal as the CMS, and lightweight React JS (Next JS SSR on Node JS) with the
211
+ following features:
212
+
213
+ PWA & AMP based Web Pages
214
+
215
+ Page Speed Optimization
216
+
217
+ Reusable and scalable React JS / Next JS Templates and Components
218
+
219
+ Headless Drupal CMS with Content & Experience management, approval workflows,
220
+ etc for seamless collaboration between the business and marketing teams
221
+
222
+ Minimalistic Buy and Renewal Journeys for various products, with API integrations
223
+ and adherence to data compliances
224
+
225
+
226
+ We achieved 250% Reduction in Operational Time and Effort in managing the Content
227
+ & Experience for Buy & renew Journeys,220% Reduction in Customer Drops during
228
+ buy and renewal journeys, 300% Reduction in bounce rate on policy landing and
229
+ campaign pages'
230
+ pipeline_tag: sentence-similarity
231
+ library_name: sentence-transformers
232
+ metrics:
233
+ - cosine_accuracy@1
234
+ - cosine_accuracy@3
235
+ - cosine_accuracy@5
236
+ - cosine_accuracy@10
237
+ - cosine_precision@1
238
+ - cosine_precision@3
239
+ - cosine_precision@5
240
+ - cosine_precision@10
241
+ - cosine_recall@1
242
+ - cosine_recall@3
243
+ - cosine_recall@5
244
+ - cosine_recall@10
245
+ - cosine_ndcg@10
246
+ - cosine_mrr@10
247
+ - cosine_map@100
248
+ model-index:
249
+ - name: BGE base Financial Matryoshka
250
+ results:
251
+ - task:
252
+ type: information-retrieval
253
+ name: Information Retrieval
254
+ dataset:
255
+ name: dim 768
256
+ type: dim_768
257
+ metrics:
258
+ - type: cosine_accuracy@1
259
+ value: 0.18666666666666668
260
+ name: Cosine Accuracy@1
261
+ - type: cosine_accuracy@3
262
+ value: 0.5866666666666667
263
+ name: Cosine Accuracy@3
264
+ - type: cosine_accuracy@5
265
+ value: 0.68
266
+ name: Cosine Accuracy@5
267
+ - type: cosine_accuracy@10
268
+ value: 0.8
269
+ name: Cosine Accuracy@10
270
+ - type: cosine_precision@1
271
+ value: 0.18666666666666668
272
+ name: Cosine Precision@1
273
+ - type: cosine_precision@3
274
+ value: 0.19555555555555554
275
+ name: Cosine Precision@3
276
+ - type: cosine_precision@5
277
+ value: 0.13599999999999998
278
+ name: Cosine Precision@5
279
+ - type: cosine_precision@10
280
+ value: 0.07999999999999997
281
+ name: Cosine Precision@10
282
+ - type: cosine_recall@1
283
+ value: 0.18666666666666668
284
+ name: Cosine Recall@1
285
+ - type: cosine_recall@3
286
+ value: 0.5866666666666667
287
+ name: Cosine Recall@3
288
+ - type: cosine_recall@5
289
+ value: 0.68
290
+ name: Cosine Recall@5
291
+ - type: cosine_recall@10
292
+ value: 0.8
293
+ name: Cosine Recall@10
294
+ - type: cosine_ndcg@10
295
+ value: 0.48942651032647805
296
+ name: Cosine Ndcg@10
297
+ - type: cosine_mrr@10
298
+ value: 0.38962962962962955
299
+ name: Cosine Mrr@10
300
+ - type: cosine_map@100
301
+ value: 0.398026376123124
302
+ name: Cosine Map@100
303
+ - task:
304
+ type: information-retrieval
305
+ name: Information Retrieval
306
+ dataset:
307
+ name: dim 512
308
+ type: dim_512
309
+ metrics:
310
+ - type: cosine_accuracy@1
311
+ value: 0.24
312
+ name: Cosine Accuracy@1
313
+ - type: cosine_accuracy@3
314
+ value: 0.5733333333333334
315
+ name: Cosine Accuracy@3
316
+ - type: cosine_accuracy@5
317
+ value: 0.6533333333333333
318
+ name: Cosine Accuracy@5
319
+ - type: cosine_accuracy@10
320
+ value: 0.8
321
+ name: Cosine Accuracy@10
322
+ - type: cosine_precision@1
323
+ value: 0.24
324
+ name: Cosine Precision@1
325
+ - type: cosine_precision@3
326
+ value: 0.1911111111111111
327
+ name: Cosine Precision@3
328
+ - type: cosine_precision@5
329
+ value: 0.13066666666666663
330
+ name: Cosine Precision@5
331
+ - type: cosine_precision@10
332
+ value: 0.07999999999999997
333
+ name: Cosine Precision@10
334
+ - type: cosine_recall@1
335
+ value: 0.24
336
+ name: Cosine Recall@1
337
+ - type: cosine_recall@3
338
+ value: 0.5733333333333334
339
+ name: Cosine Recall@3
340
+ - type: cosine_recall@5
341
+ value: 0.6533333333333333
342
+ name: Cosine Recall@5
343
+ - type: cosine_recall@10
344
+ value: 0.8
345
+ name: Cosine Recall@10
346
+ - type: cosine_ndcg@10
347
+ value: 0.4991793077336057
348
+ name: Cosine Ndcg@10
349
+ - type: cosine_mrr@10
350
+ value: 0.4047195767195766
351
+ name: Cosine Mrr@10
352
+ - type: cosine_map@100
353
+ value: 0.4124023465759078
354
+ name: Cosine Map@100
355
+ - task:
356
+ type: information-retrieval
357
+ name: Information Retrieval
358
+ dataset:
359
+ name: dim 256
360
+ type: dim_256
361
+ metrics:
362
+ - type: cosine_accuracy@1
363
+ value: 0.21333333333333335
364
+ name: Cosine Accuracy@1
365
+ - type: cosine_accuracy@3
366
+ value: 0.5466666666666666
367
+ name: Cosine Accuracy@3
368
+ - type: cosine_accuracy@5
369
+ value: 0.6266666666666667
370
+ name: Cosine Accuracy@5
371
+ - type: cosine_accuracy@10
372
+ value: 0.7466666666666667
373
+ name: Cosine Accuracy@10
374
+ - type: cosine_precision@1
375
+ value: 0.21333333333333335
376
+ name: Cosine Precision@1
377
+ - type: cosine_precision@3
378
+ value: 0.1822222222222222
379
+ name: Cosine Precision@3
380
+ - type: cosine_precision@5
381
+ value: 0.12533333333333332
382
+ name: Cosine Precision@5
383
+ - type: cosine_precision@10
384
+ value: 0.07466666666666665
385
+ name: Cosine Precision@10
386
+ - type: cosine_recall@1
387
+ value: 0.21333333333333335
388
+ name: Cosine Recall@1
389
+ - type: cosine_recall@3
390
+ value: 0.5466666666666666
391
+ name: Cosine Recall@3
392
+ - type: cosine_recall@5
393
+ value: 0.6266666666666667
394
+ name: Cosine Recall@5
395
+ - type: cosine_recall@10
396
+ value: 0.7466666666666667
397
+ name: Cosine Recall@10
398
+ - type: cosine_ndcg@10
399
+ value: 0.4717065825983648
400
+ name: Cosine Ndcg@10
401
+ - type: cosine_mrr@10
402
+ value: 0.38359259259259254
403
+ name: Cosine Mrr@10
404
+ - type: cosine_map@100
405
+ value: 0.39417579048787715
406
+ name: Cosine Map@100
407
+ - task:
408
+ type: information-retrieval
409
+ name: Information Retrieval
410
+ dataset:
411
+ name: dim 128
412
+ type: dim_128
413
+ metrics:
414
+ - type: cosine_accuracy@1
415
+ value: 0.21333333333333335
416
+ name: Cosine Accuracy@1
417
+ - type: cosine_accuracy@3
418
+ value: 0.52
419
+ name: Cosine Accuracy@3
420
+ - type: cosine_accuracy@5
421
+ value: 0.5733333333333334
422
+ name: Cosine Accuracy@5
423
+ - type: cosine_accuracy@10
424
+ value: 0.7066666666666667
425
+ name: Cosine Accuracy@10
426
+ - type: cosine_precision@1
427
+ value: 0.21333333333333335
428
+ name: Cosine Precision@1
429
+ - type: cosine_precision@3
430
+ value: 0.1733333333333333
431
+ name: Cosine Precision@3
432
+ - type: cosine_precision@5
433
+ value: 0.11466666666666667
434
+ name: Cosine Precision@5
435
+ - type: cosine_precision@10
436
+ value: 0.07066666666666666
437
+ name: Cosine Precision@10
438
+ - type: cosine_recall@1
439
+ value: 0.21333333333333335
440
+ name: Cosine Recall@1
441
+ - type: cosine_recall@3
442
+ value: 0.52
443
+ name: Cosine Recall@3
444
+ - type: cosine_recall@5
445
+ value: 0.5733333333333334
446
+ name: Cosine Recall@5
447
+ - type: cosine_recall@10
448
+ value: 0.7066666666666667
449
+ name: Cosine Recall@10
450
+ - type: cosine_ndcg@10
451
+ value: 0.44415760022208445
452
+ name: Cosine Ndcg@10
453
+ - type: cosine_mrr@10
454
+ value: 0.36086772486772484
455
+ name: Cosine Mrr@10
456
+ - type: cosine_map@100
457
+ value: 0.37364447853598953
458
+ name: Cosine Map@100
459
+ - task:
460
+ type: information-retrieval
461
+ name: Information Retrieval
462
+ dataset:
463
+ name: dim 64
464
+ type: dim_64
465
+ metrics:
466
+ - type: cosine_accuracy@1
467
+ value: 0.14666666666666667
468
+ name: Cosine Accuracy@1
469
+ - type: cosine_accuracy@3
470
+ value: 0.4
471
+ name: Cosine Accuracy@3
472
+ - type: cosine_accuracy@5
473
+ value: 0.5066666666666667
474
+ name: Cosine Accuracy@5
475
+ - type: cosine_accuracy@10
476
+ value: 0.6133333333333333
477
+ name: Cosine Accuracy@10
478
+ - type: cosine_precision@1
479
+ value: 0.14666666666666667
480
+ name: Cosine Precision@1
481
+ - type: cosine_precision@3
482
+ value: 0.13333333333333333
483
+ name: Cosine Precision@3
484
+ - type: cosine_precision@5
485
+ value: 0.10133333333333334
486
+ name: Cosine Precision@5
487
+ - type: cosine_precision@10
488
+ value: 0.06133333333333333
489
+ name: Cosine Precision@10
490
+ - type: cosine_recall@1
491
+ value: 0.14666666666666667
492
+ name: Cosine Recall@1
493
+ - type: cosine_recall@3
494
+ value: 0.4
495
+ name: Cosine Recall@3
496
+ - type: cosine_recall@5
497
+ value: 0.5066666666666667
498
+ name: Cosine Recall@5
499
+ - type: cosine_recall@10
500
+ value: 0.6133333333333333
501
+ name: Cosine Recall@10
502
+ - type: cosine_ndcg@10
503
+ value: 0.3595031317594935
504
+ name: Cosine Ndcg@10
505
+ - type: cosine_mrr@10
506
+ value: 0.27981481481481474
507
+ name: Cosine Mrr@10
508
+ - type: cosine_map@100
509
+ value: 0.29776557642203677
510
+ name: Cosine Map@100
511
+ ---
512
+
513
+ # BGE base Financial Matryoshka
514
+
515
+ This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [BAAI/bge-base-en-v1.5](https://huggingface.co/BAAI/bge-base-en-v1.5). It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
516
+
517
+ ## Model Details
518
+
519
+ ### Model Description
520
+ - **Model Type:** Sentence Transformer
521
+ - **Base model:** [BAAI/bge-base-en-v1.5](https://huggingface.co/BAAI/bge-base-en-v1.5) <!-- at revision a5beb1e3e68b9ab74eb54cfd186867f64f240e1a -->
522
+ - **Maximum Sequence Length:** 512 tokens
523
+ - **Output Dimensionality:** 768 dimensions
524
+ - **Similarity Function:** Cosine Similarity
525
+ <!-- - **Training Dataset:** Unknown -->
526
+ - **Language:** en
527
+ - **License:** apache-2.0
528
+
529
+ ### Model Sources
530
+
531
+ - **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
532
+ - **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
533
+ - **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)
534
+
535
+ ### Full Model Architecture
536
+
537
+ ```
538
+ SentenceTransformer(
539
+ (0): Transformer({'max_seq_length': 512, 'do_lower_case': True}) with Transformer model: BertModel
540
+ (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
541
+ (2): Normalize()
542
+ )
543
+ ```
544
+
545
+ ## Usage
546
+
547
+ ### Direct Usage (Sentence Transformers)
548
+
549
+ First install the Sentence Transformers library:
550
+
551
+ ```bash
552
+ pip install -U sentence-transformers
553
+ ```
554
+
555
+ Then you can load this model and run inference.
556
+ ```python
557
+ from sentence_transformers import SentenceTransformer
558
+
559
+ # Download from the 🤗 Hub
560
+ model = SentenceTransformer("Shashwat13333/bge-base-en-v1.5_v1")
561
+ # Run inference
562
+ sentences = [
563
+ 'What do you guys do for digital strategy?',
564
+ ' What we do\n\nDigital Strategy\nCreating digital frameworks that transform your digital enterprise and produce a return on investment.\n\nPlatform Selection\nHelping you select the optimal digital experience, commerce, cloud and marketing platform for your enterprise.\n\nPlatform Builds\nDeploying next-gen scalable and agile enterprise digital platforms, along with multi-platform integrations.\n\nProduct Builds\nHelp you ideate, strategize, and engineer your product with help of our enterprise frameworks \n\nTeam Augmentation\nHelp you scale up and augment your existing team to solve your hiring challenges with our easy to deploy staff augmentation offerings .\nManaged Services\nOperate and monitor your business-critical applications, data, and IT workloads, along with Application maintenance and operations\n',
565
+ 'Introducing the world of General Insurance Firm\nIn this project, we implemented Digital Solution and Implementation with Headless Drupal as the CMS, and lightweight React JS (Next JS SSR on Node JS) with the following features:\nPWA & AMP based Web Pages\nPage Speed Optimization\nReusable and scalable React JS / Next JS Templates and Components\nHeadless Drupal CMS with Content & Experience management, approval workflows, etc for seamless collaboration between the business and marketing teams\nMinimalistic Buy and Renewal Journeys for various products, with API integrations and adherence to data compliances\n\nWe achieved 250% Reduction in Operational Time and Effort in managing the Content & Experience for Buy & renew Journeys,220% Reduction in Customer Drops during buy and renewal journeys, 300% Reduction in bounce rate on policy landing and campaign pages',
566
+ ]
567
+ embeddings = model.encode(sentences)
568
+ print(embeddings.shape)
569
+ # [3, 768]
570
+
571
+ # Get the similarity scores for the embeddings
572
+ similarities = model.similarity(embeddings, embeddings)
573
+ print(similarities.shape)
574
+ # [3, 3]
575
+ ```
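Because the model was trained with `MatryoshkaLoss` over the dimensions 768, 512, 256, 128 and 64, the embeddings can also be truncated at load time. A minimal sketch (256 is just an example; any of the trained dimensions works):

```python
from sentence_transformers import SentenceTransformer

# Load the same checkpoint, but keep only the first 256 embedding dimensions.
model_256 = SentenceTransformer("Shashwat13333/bge-base-en-v1.5_v1", truncate_dim=256)
embeddings = model_256.encode(["What do you guys do for digital strategy?"])
print(embeddings.shape)
# (1, 256)
```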
576
+
577
+ <!--
578
+ ### Direct Usage (Transformers)
579
+
580
+ <details><summary>Click to see the direct usage in Transformers</summary>
581
+
582
+ </details>
583
+ -->
584
+
585
+ <!--
586
+ ### Downstream Usage (Sentence Transformers)
587
+
588
+ You can finetune this model on your own dataset.
589
+
590
+ <details><summary>Click to expand</summary>
591
+
592
+ </details>
593
+ -->
594
+
595
+ <!--
596
+ ### Out-of-Scope Use
597
+
598
+ *List how the model may foreseeably be misused and address what users ought not to do with the model.*
599
+ -->
600
+
601
+ ## Evaluation
602
+
603
+ ### Metrics
604
+
605
+ #### Information Retrieval
606
+
607
+ * Datasets: `dim_768`, `dim_512`, `dim_256`, `dim_128` and `dim_64`
608
+ * Evaluated with [<code>InformationRetrievalEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator)
609
+
610
+ | Metric | dim_768 | dim_512 | dim_256 | dim_128 | dim_64 |
611
+ |:--------------------|:-----------|:-----------|:-----------|:-----------|:-----------|
612
+ | cosine_accuracy@1 | 0.1867 | 0.24 | 0.2133 | 0.2133 | 0.1467 |
613
+ | cosine_accuracy@3 | 0.5867 | 0.5733 | 0.5467 | 0.52 | 0.4 |
614
+ | cosine_accuracy@5 | 0.68 | 0.6533 | 0.6267 | 0.5733 | 0.5067 |
615
+ | cosine_accuracy@10 | 0.8 | 0.8 | 0.7467 | 0.7067 | 0.6133 |
616
+ | cosine_precision@1 | 0.1867 | 0.24 | 0.2133 | 0.2133 | 0.1467 |
617
+ | cosine_precision@3 | 0.1956 | 0.1911 | 0.1822 | 0.1733 | 0.1333 |
618
+ | cosine_precision@5 | 0.136 | 0.1307 | 0.1253 | 0.1147 | 0.1013 |
619
+ | cosine_precision@10 | 0.08 | 0.08 | 0.0747 | 0.0707 | 0.0613 |
620
+ | cosine_recall@1 | 0.1867 | 0.24 | 0.2133 | 0.2133 | 0.1467 |
621
+ | cosine_recall@3 | 0.5867 | 0.5733 | 0.5467 | 0.52 | 0.4 |
622
+ | cosine_recall@5 | 0.68 | 0.6533 | 0.6267 | 0.5733 | 0.5067 |
623
+ | cosine_recall@10 | 0.8 | 0.8 | 0.7467 | 0.7067 | 0.6133 |
624
+ | **cosine_ndcg@10** | **0.4894** | **0.4992** | **0.4717** | **0.4442** | **0.3595** |
625
+ | cosine_mrr@10 | 0.3896 | 0.4047 | 0.3836 | 0.3609 | 0.2798 |
626
+ | cosine_map@100 | 0.398 | 0.4124 | 0.3942 | 0.3736 | 0.2978 |
627
+
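The per-dimension columns above come from running `InformationRetrievalEvaluator` once per truncation dimension. A rough sketch of such an evaluation; the `queries`, `corpus` and `relevant_docs` dictionaries below are placeholders, not the actual evaluation split:

```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import InformationRetrievalEvaluator

model = SentenceTransformer("Shashwat13333/bge-base-en-v1.5_v1", truncate_dim=768)

# Placeholder data: query id -> text, document id -> text,
# and query id -> set of relevant document ids.
queries = {"q1": "What do you guys do for digital strategy?"}
corpus = {"d1": "Digital Strategy: creating digital frameworks that transform your digital enterprise ..."}
relevant_docs = {"q1": {"d1"}}

evaluator = InformationRetrievalEvaluator(queries, corpus, relevant_docs, name="dim_768")
results = evaluator(model)
print(results)  # dict of metrics such as accuracy@k, ndcg@10, mrr@10, map@100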
628
+ <!--
629
+ ## Bias, Risks and Limitations
630
+
631
+ *What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
632
+ -->
633
+
634
+ <!--
635
+ ### Recommendations
636
+
637
+ *What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
638
+ -->
639
+
640
+ ## Training Details
641
+
642
+ ### Training Dataset
643
+
644
+ #### Unnamed Dataset
645
+
646
+ * Size: 150 training samples
647
+ * Columns: <code>anchor</code> and <code>positive</code>
648
+ * Approximate statistics based on the first 150 samples:
649
+ | | anchor | positive |
650
+ |:--------|:----------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|
651
+ | type | string | string |
652
+ | details | <ul><li>min: 7 tokens</li><li>mean: 12.15 tokens</li><li>max: 20 tokens</li></ul> | <ul><li>min: 20 tokens</li><li>mean: 126.17 tokens</li><li>max: 378 tokens</li></ul> |
653
+ * Samples:
654
+ | anchor | positive |
655
+ |:--------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
656
+ | <code>Is it hard to move old systems to the cloud?</code> | <code>We offer custom software development, digital marketing strategies, and tailored solutions to drive tangible results for your business. Our expert team combines technical prowess with industry insights to propel your business forward in the digital landscape.<br><br>"Engage, analyze & target your customers<br>Digital transformation enables you to interact with customers across multiple channels, providing personalized experiences. This could include social media engagement, interactive websites, and mobile apps." "Empower your employees & partners<br>The push for digital transformation has led many companies to embrace cloud solutions. However, the migration and integration of legacy systems into the cloud often present challenges." "Optimize & automate your operations<br>The push for digital transformation has led many companies to embrace cloud solutions. However, the migration and integration of legacy systems into the cloud often present challenges." "Transform your products<br>The push for digi...</code> |
657
+ | <code>What benefits does marketing automation offer for time management?</code> | <code>Our MarTech capabilities<br><br>Personalization<br>Involves tailoring marketing messages and experiences to individual customers. It enhances customer engagement, loyalty, and ultimately, conversion rates.<br><br>Marketing Automation<br>Marketing automation streamlines repetitive tasks such as email marketing, lead nurturing, and social media posting. It improves efficiency, saves time, and ensures timely communication with customers.<br><br>Customer Relationship Management<br>CRM systems help manage interactions with current and potential customers. They store customer data, track interactions, and facilitate communication, improving customer retention.</code> |
658
+ | <code>do you track customer behavior?</code> | <code>How can your recommendation engines improve our business?<br>Our recommendation engines are designed to analyze customer behavior and preferences to deliver personalized suggestions, enhancing user experience, increasing sales, and boosting customer retention.</code> |
659
+ * Loss: [<code>MatryoshkaLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#matryoshkaloss) with these parameters:
660
+ ```json
661
+ {
662
+ "loss": "MultipleNegativesRankingLoss",
663
+ "matryoshka_dims": [
664
+ 768,
665
+ 512,
666
+ 256,
667
+ 128,
668
+ 64
669
+ ],
670
+ "matryoshka_weights": [
671
+ 1,
672
+ 1,
673
+ 1,
674
+ 1,
675
+ 1
676
+ ],
677
+ "n_dims_per_step": -1
678
+ }
679
+ ```
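In code, these parameters correspond to wrapping `MultipleNegativesRankingLoss` in `MatryoshkaLoss` over the listed dimensions. A minimal sketch with a toy two-pair dataset in the same `anchor`/`positive` format as above (the example texts are shortened placeholders):

```python
from datasets import Dataset
from sentence_transformers import SentenceTransformer
from sentence_transformers.losses import MatryoshkaLoss, MultipleNegativesRankingLoss

model = SentenceTransformer("BAAI/bge-base-en-v1.5")

# Tiny stand-in for the 150-sample anchor/positive training set.
train_dataset = Dataset.from_dict({
    "anchor": ["do you track customer behavior?", "What do you guys do for digital strategy?"],
    "positive": ["Our recommendation engines analyze customer behavior ...", "Digital Strategy: creating digital frameworks ..."],
})

base_loss = MultipleNegativesRankingLoss(model)
loss = MatryoshkaLoss(model, base_loss, matryoshka_dims=[768, 512, 256, 128, 64])
```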
680
+
681
+ ### Training Hyperparameters
682
+ #### Non-Default Hyperparameters
683
+
684
+ - `eval_strategy`: epoch
685
+ - `gradient_accumulation_steps`: 4
686
+ - `learning_rate`: 1e-05
687
+ - `weight_decay`: 0.01
688
+ - `num_train_epochs`: 4
689
+ - `lr_scheduler_type`: cosine
690
+ - `warmup_ratio`: 0.1
691
+ - `fp16`: True
692
+ - `load_best_model_at_end`: True
693
+ - `optim`: adamw_torch_fused
694
+ - `push_to_hub`: True
695
+ - `hub_model_id`: Shashwat13333/bge-base-en-v1.5_v1
696
+ - `push_to_hub_model_id`: bge-base-en-v1.5_v1
697
+ - `batch_sampler`: no_duplicates
698
+
699
+ #### All Hyperparameters
700
+ <details><summary>Click to expand</summary>
701
+
702
+ - `overwrite_output_dir`: False
703
+ - `do_predict`: False
704
+ - `eval_strategy`: epoch
705
+ - `prediction_loss_only`: True
706
+ - `per_device_train_batch_size`: 8
707
+ - `per_device_eval_batch_size`: 8
708
+ - `per_gpu_train_batch_size`: None
709
+ - `per_gpu_eval_batch_size`: None
710
+ - `gradient_accumulation_steps`: 4
711
+ - `eval_accumulation_steps`: None
712
+ - `torch_empty_cache_steps`: None
713
+ - `learning_rate`: 1e-05
714
+ - `weight_decay`: 0.01
715
+ - `adam_beta1`: 0.9
716
+ - `adam_beta2`: 0.999
717
+ - `adam_epsilon`: 1e-08
718
+ - `max_grad_norm`: 1.0
719
+ - `num_train_epochs`: 4
720
+ - `max_steps`: -1
721
+ - `lr_scheduler_type`: cosine
722
+ - `lr_scheduler_kwargs`: {}
723
+ - `warmup_ratio`: 0.1
724
+ - `warmup_steps`: 0
725
+ - `log_level`: passive
726
+ - `log_level_replica`: warning
727
+ - `log_on_each_node`: True
728
+ - `logging_nan_inf_filter`: True
729
+ - `save_safetensors`: True
730
+ - `save_on_each_node`: False
731
+ - `save_only_model`: False
732
+ - `restore_callback_states_from_checkpoint`: False
733
+ - `no_cuda`: False
734
+ - `use_cpu`: False
735
+ - `use_mps_device`: False
736
+ - `seed`: 42
737
+ - `data_seed`: None
738
+ - `jit_mode_eval`: False
739
+ - `use_ipex`: False
740
+ - `bf16`: False
741
+ - `fp16`: True
742
+ - `fp16_opt_level`: O1
743
+ - `half_precision_backend`: auto
744
+ - `bf16_full_eval`: False
745
+ - `fp16_full_eval`: False
746
+ - `tf32`: None
747
+ - `local_rank`: 0
748
+ - `ddp_backend`: None
749
+ - `tpu_num_cores`: None
750
+ - `tpu_metrics_debug`: False
751
+ - `debug`: []
752
+ - `dataloader_drop_last`: False
753
+ - `dataloader_num_workers`: 0
754
+ - `dataloader_prefetch_factor`: None
755
+ - `past_index`: -1
756
+ - `disable_tqdm`: False
757
+ - `remove_unused_columns`: True
758
+ - `label_names`: None
759
+ - `load_best_model_at_end`: True
760
+ - `ignore_data_skip`: False
761
+ - `fsdp`: []
762
+ - `fsdp_min_num_params`: 0
763
+ - `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
764
+ - `fsdp_transformer_layer_cls_to_wrap`: None
765
+ - `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
766
+ - `deepspeed`: None
767
+ - `label_smoothing_factor`: 0.0
768
+ - `optim`: adamw_torch_fused
769
+ - `optim_args`: None
770
+ - `adafactor`: False
771
+ - `group_by_length`: False
772
+ - `length_column_name`: length
773
+ - `ddp_find_unused_parameters`: None
774
+ - `ddp_bucket_cap_mb`: None
775
+ - `ddp_broadcast_buffers`: False
776
+ - `dataloader_pin_memory`: True
777
+ - `dataloader_persistent_workers`: False
778
+ - `skip_memory_metrics`: True
779
+ - `use_legacy_prediction_loop`: False
780
+ - `push_to_hub`: True
781
+ - `resume_from_checkpoint`: None
782
+ - `hub_model_id`: Shashwat13333/bge-base-en-v1.5_v1
783
+ - `hub_strategy`: every_save
784
+ - `hub_private_repo`: None
785
+ - `hub_always_push`: False
786
+ - `gradient_checkpointing`: False
787
+ - `gradient_checkpointing_kwargs`: None
788
+ - `include_inputs_for_metrics`: False
789
+ - `include_for_metrics`: []
790
+ - `eval_do_concat_batches`: True
791
+ - `fp16_backend`: auto
792
+ - `push_to_hub_model_id`: bge-base-en-v1.5_v1
793
+ - `push_to_hub_organization`: None
794
+ - `mp_parameters`:
795
+ - `auto_find_batch_size`: False
796
+ - `full_determinism`: False
797
+ - `torchdynamo`: None
798
+ - `ray_scope`: last
799
+ - `ddp_timeout`: 1800
800
+ - `torch_compile`: False
801
+ - `torch_compile_backend`: None
802
+ - `torch_compile_mode`: None
803
+ - `dispatch_batches`: None
804
+ - `split_batches`: None
805
+ - `include_tokens_per_second`: False
806
+ - `include_num_input_tokens_seen`: False
807
+ - `neftune_noise_alpha`: None
808
+ - `optim_target_modules`: None
809
+ - `batch_eval_metrics`: False
810
+ - `eval_on_start`: False
811
+ - `use_liger_kernel`: False
812
+ - `eval_use_gather_object`: False
813
+ - `average_tokens_across_devices`: False
814
+ - `prompts`: None
815
+ - `batch_sampler`: no_duplicates
816
+ - `multi_dataset_batch_sampler`: proportional
817
+
818
+ </details>
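As a sketch, the non-default hyperparameters above map onto a `SentenceTransformerTrainer` setup roughly like this, reusing the `model`, `train_dataset` and `loss` objects from the loss sketch earlier. This is an approximation of the training run, not a verbatim copy of the original script; in particular, `save_strategy="epoch"` is assumed here so that `load_best_model_at_end` can work:

```python
from sentence_transformers import SentenceTransformerTrainer, SentenceTransformerTrainingArguments
from sentence_transformers.training_args import BatchSamplers

args = SentenceTransformerTrainingArguments(
    output_dir="bge-base-en-v1.5_v1",
    num_train_epochs=4,
    per_device_train_batch_size=8,
    gradient_accumulation_steps=4,
    learning_rate=1e-5,
    weight_decay=0.01,
    lr_scheduler_type="cosine",
    warmup_ratio=0.1,
    fp16=True,
    eval_strategy="epoch",
    save_strategy="epoch",          # assumed, so load_best_model_at_end works
    load_best_model_at_end=True,
    optim="adamw_torch_fused",
    batch_sampler=BatchSamplers.NO_DUPLICATES,
    push_to_hub=True,
    hub_model_id="Shashwat13333/bge-base-en-v1.5_v1",
)

trainer = SentenceTransformerTrainer(
    model=model,                    # from the MatryoshkaLoss sketch above
    args=args,
    train_dataset=train_dataset,
    loss=loss,
)
trainer.train()
```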
819
+
820
+ ### Training Logs
821
+ | Epoch | Step | Training Loss | dim_768_cosine_ndcg@10 | dim_512_cosine_ndcg@10 | dim_256_cosine_ndcg@10 | dim_128_cosine_ndcg@10 | dim_64_cosine_ndcg@10 |
822
+ |:----------:|:------:|:-------------:|:----------------------:|:----------------------:|:----------------------:|:----------------------:|:---------------------:|
823
+ | 0.2105 | 1 | 22.6183 | - | - | - | - | - |
824
+ | 0.8421 | 4 | - | 0.4602 | 0.4392 | 0.4498 | 0.4162 | 0.3698 |
825
+ | 1.2105 | 5 | 20.549 | - | - | - | - | - |
826
+ | 1.8421 | 8 | - | 0.5047 | 0.4304 | 0.4538 | 0.4202 | 0.3458 |
827
+ | 2.4211 | 10 | 17.664 | - | - | - | - | - |
828
+ | **2.8421** | **12** | **-** | **0.482** | **0.4618** | **0.4658** | **0.4537** | **0.3496** |
829
+ | 3.6316 | 15 | 14.6735 | - | - | - | - | - |
830
+ | 3.8421 | 16 | - | 0.4894 | 0.4992 | 0.4717 | 0.4442 | 0.3595 |
831
+
832
+ * The bold row denotes the saved checkpoint.
833
+
834
+ ### Framework Versions
835
+ - Python: 3.11.11
836
+ - Sentence Transformers: 3.4.1
837
+ - Transformers: 4.48.2
838
+ - PyTorch: 2.5.1+cu124
839
+ - Accelerate: 1.3.0
840
+ - Datasets: 3.2.0
841
+ - Tokenizers: 0.21.0
842
+
843
+ ## Citation
844
+
845
+ ### BibTeX
846
+
847
+ #### Sentence Transformers
848
+ ```bibtex
849
+ @inproceedings{reimers-2019-sentence-bert,
850
+ title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
851
+ author = "Reimers, Nils and Gurevych, Iryna",
852
+ booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
853
+ month = "11",
854
+ year = "2019",
855
+ publisher = "Association for Computational Linguistics",
856
+ url = "https://arxiv.org/abs/1908.10084",
857
+ }
858
+ ```
859
+
860
+ #### MatryoshkaLoss
861
+ ```bibtex
862
+ @misc{kusupati2024matryoshka,
863
+ title={Matryoshka Representation Learning},
864
+ author={Aditya Kusupati and Gantavya Bhatt and Aniket Rege and Matthew Wallingford and Aditya Sinha and Vivek Ramanujan and William Howard-Snyder and Kaifeng Chen and Sham Kakade and Prateek Jain and Ali Farhadi},
865
+ year={2024},
866
+ eprint={2205.13147},
867
+ archivePrefix={arXiv},
868
+ primaryClass={cs.LG}
869
+ }
870
+ ```
871
+
872
+ #### MultipleNegativesRankingLoss
873
+ ```bibtex
874
+ @misc{henderson2017efficient,
875
+ title={Efficient Natural Language Response Suggestion for Smart Reply},
876
+ author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
877
+ year={2017},
878
+ eprint={1705.00652},
879
+ archivePrefix={arXiv},
880
+ primaryClass={cs.CL}
881
+ }
882
+ ```
883
+
884
+ <!--
885
+ ## Glossary
886
+
887
+ *Clearly define terms in order to be accessible across audiences.*
888
+ -->
889
+
890
+ <!--
891
+ ## Model Card Authors
892
+
893
+ *Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
894
+ -->
895
+
896
+ <!--
897
+ ## Model Card Contact
898
+
899
+ *Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
900
+ -->
config_sentence_transformers.json ADDED
@@ -0,0 +1,10 @@
+ {
+   "__version__": {
+     "sentence_transformers": "3.4.1",
+     "transformers": "4.48.2",
+     "pytorch": "2.5.1+cu124"
+   },
+   "prompts": {},
+   "default_prompt_name": null,
+   "similarity_fn_name": "cosine"
+ }
model.safetensors CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:7db8bd4456db9f5e625ce439c1f72135d3cba5e8fe5055ec468a33942a17e785
+ oid sha256:5b3c73811d23d463d9074b8fcc0bc000896f702fd32b97e23bfa5c1e3a75cad7
  size 437951328
modules.json ADDED
@@ -0,0 +1,20 @@
+ [
+   {
+     "idx": 0,
+     "name": "0",
+     "path": "",
+     "type": "sentence_transformers.models.Transformer"
+   },
+   {
+     "idx": 1,
+     "name": "1",
+     "path": "1_Pooling",
+     "type": "sentence_transformers.models.Pooling"
+   },
+   {
+     "idx": 2,
+     "name": "2",
+     "path": "2_Normalize",
+     "type": "sentence_transformers.models.Normalize"
+   }
+ ]
sentence_bert_config.json ADDED
@@ -0,0 +1,4 @@
+ {
+   "max_seq_length": 512,
+   "do_lower_case": true
+ }
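Taken together, `modules.json`, `sentence_bert_config.json` and the pooling config define a three-stage pipeline: a BERT Transformer (max sequence length 512, lower-cased input), CLS-token Pooling, and a final Normalize module. Because of that last step the embeddings are unit-length, so the cosine similarity configured in `config_sentence_transformers.json` coincides with a plain dot product. A quick check, as a sketch:

```python
import numpy as np
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("Shashwat13333/bge-base-en-v1.5_v1")
emb = model.encode(["Do you provide support 24/7?", "What services do you offer for AI adoption?"])

print(np.linalg.norm(emb, axis=1))  # ~[1. 1.]  (effect of the Normalize module)
print(emb @ emb.T)                  # dot products equal cosine similarities
```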